Oct 08 22:23:09 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 08 22:23:10 crc restorecon[4730]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 08 22:23:10 crc restorecon[4730]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc 
restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc 
restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 
22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc 
restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc 
restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 08 22:23:10
crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:10 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 
22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 08 22:23:11 crc restorecon[4730]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 
crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc 
restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 08 22:23:11 crc restorecon[4730]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 08 22:23:13 crc kubenswrapper[4834]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 22:23:13 crc kubenswrapper[4834]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 08 22:23:13 crc kubenswrapper[4834]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 22:23:13 crc kubenswrapper[4834]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 08 22:23:13 crc kubenswrapper[4834]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 08 22:23:13 crc kubenswrapper[4834]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.236282 4834 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243126 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243181 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243193 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243203 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243211 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243219 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243228 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243236 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243243 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243251 
4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243258 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243266 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243274 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243282 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243289 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243297 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243304 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243315 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243326 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243343 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243353 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243364 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243373 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243382 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243390 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243397 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243406 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243414 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243421 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243429 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243437 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243444 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243452 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243459 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243467 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243475 4834 feature_gate.go:330] 
unrecognized feature gate: InsightsConfigAPI Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243482 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243490 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243498 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243505 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243513 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243520 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243528 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243536 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243544 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243552 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243562 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243569 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243577 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243588 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243597 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243605 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243613 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243621 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243630 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243639 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243647 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243654 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243662 4834 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243670 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243677 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243687 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243694 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243702 4834 feature_gate.go:330] unrecognized feature gate: Example Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 
22:23:13.243710 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243717 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243725 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243734 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243742 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243749 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.243757 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244652 4834 flags.go:64] FLAG: --address="0.0.0.0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244678 4834 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244697 4834 flags.go:64] FLAG: --anonymous-auth="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244709 4834 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244721 4834 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244730 4834 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244742 4834 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244752 4834 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244762 4834 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244771 4834 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244781 4834 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244790 4834 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244799 4834 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244808 4834 flags.go:64] FLAG: --cgroup-root="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244817 4834 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244826 4834 flags.go:64] FLAG: --client-ca-file="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244837 4834 flags.go:64] FLAG: --cloud-config="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244846 4834 flags.go:64] FLAG: --cloud-provider="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244855 4834 flags.go:64] FLAG: --cluster-dns="[]" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244865 4834 flags.go:64] FLAG: --cluster-domain="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244874 4834 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244883 4834 flags.go:64] FLAG: --config-dir="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244891 4834 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244900 4834 flags.go:64] FLAG: --container-log-max-files="5" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244912 4834 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244921 4834 flags.go:64] 
FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244930 4834 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244939 4834 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244948 4834 flags.go:64] FLAG: --contention-profiling="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244957 4834 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244966 4834 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244975 4834 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244985 4834 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.244996 4834 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245004 4834 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245014 4834 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245022 4834 flags.go:64] FLAG: --enable-load-reader="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245031 4834 flags.go:64] FLAG: --enable-server="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245040 4834 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245053 4834 flags.go:64] FLAG: --event-burst="100" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245062 4834 flags.go:64] FLAG: --event-qps="50" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245071 4834 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 
22:23:13.245080 4834 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245089 4834 flags.go:64] FLAG: --eviction-hard="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245099 4834 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245108 4834 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245117 4834 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245127 4834 flags.go:64] FLAG: --eviction-soft="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245136 4834 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245172 4834 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245182 4834 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245191 4834 flags.go:64] FLAG: --experimental-mounter-path="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245199 4834 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245208 4834 flags.go:64] FLAG: --fail-swap-on="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245216 4834 flags.go:64] FLAG: --feature-gates="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245256 4834 flags.go:64] FLAG: --file-check-frequency="20s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245267 4834 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245277 4834 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245286 4834 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 08 22:23:13 crc 
kubenswrapper[4834]: I1008 22:23:13.245295 4834 flags.go:64] FLAG: --healthz-port="10248" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245304 4834 flags.go:64] FLAG: --help="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245313 4834 flags.go:64] FLAG: --hostname-override="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245322 4834 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245331 4834 flags.go:64] FLAG: --http-check-frequency="20s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245340 4834 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245349 4834 flags.go:64] FLAG: --image-credential-provider-config="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245359 4834 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245367 4834 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245378 4834 flags.go:64] FLAG: --image-service-endpoint="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245386 4834 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245395 4834 flags.go:64] FLAG: --kube-api-burst="100" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245404 4834 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245414 4834 flags.go:64] FLAG: --kube-api-qps="50" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245423 4834 flags.go:64] FLAG: --kube-reserved="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245432 4834 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245440 4834 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 08 22:23:13 crc 
kubenswrapper[4834]: I1008 22:23:13.245449 4834 flags.go:64] FLAG: --kubelet-cgroups="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245458 4834 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245467 4834 flags.go:64] FLAG: --lock-file="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245476 4834 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245485 4834 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245494 4834 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245507 4834 flags.go:64] FLAG: --log-json-split-stream="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245517 4834 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245526 4834 flags.go:64] FLAG: --log-text-split-stream="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245535 4834 flags.go:64] FLAG: --logging-format="text" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245544 4834 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245554 4834 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245562 4834 flags.go:64] FLAG: --manifest-url="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245571 4834 flags.go:64] FLAG: --manifest-url-header="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245582 4834 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245592 4834 flags.go:64] FLAG: --max-open-files="1000000" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245603 4834 flags.go:64] FLAG: --max-pods="110" Oct 08 22:23:13 crc 
kubenswrapper[4834]: I1008 22:23:13.245611 4834 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245621 4834 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245630 4834 flags.go:64] FLAG: --memory-manager-policy="None" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245638 4834 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245649 4834 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245658 4834 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245667 4834 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245687 4834 flags.go:64] FLAG: --node-status-max-images="50" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245696 4834 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245705 4834 flags.go:64] FLAG: --oom-score-adj="-999" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245715 4834 flags.go:64] FLAG: --pod-cidr="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245725 4834 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245738 4834 flags.go:64] FLAG: --pod-manifest-path="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245747 4834 flags.go:64] FLAG: --pod-max-pids="-1" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245757 4834 flags.go:64] FLAG: --pods-per-core="0" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245765 4834 
flags.go:64] FLAG: --port="10250" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245775 4834 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245783 4834 flags.go:64] FLAG: --provider-id="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245793 4834 flags.go:64] FLAG: --qos-reserved="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245802 4834 flags.go:64] FLAG: --read-only-port="10255" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245811 4834 flags.go:64] FLAG: --register-node="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245820 4834 flags.go:64] FLAG: --register-schedulable="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245828 4834 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245843 4834 flags.go:64] FLAG: --registry-burst="10" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245851 4834 flags.go:64] FLAG: --registry-qps="5" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245861 4834 flags.go:64] FLAG: --reserved-cpus="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245870 4834 flags.go:64] FLAG: --reserved-memory="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245880 4834 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245889 4834 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245898 4834 flags.go:64] FLAG: --rotate-certificates="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245908 4834 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245916 4834 flags.go:64] FLAG: --runonce="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245925 4834 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245934 4834 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245943 4834 flags.go:64] FLAG: --seccomp-default="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245952 4834 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245961 4834 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245976 4834 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245986 4834 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.245995 4834 flags.go:64] FLAG: --storage-driver-password="root" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246004 4834 flags.go:64] FLAG: --storage-driver-secure="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246014 4834 flags.go:64] FLAG: --storage-driver-table="stats" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246022 4834 flags.go:64] FLAG: --storage-driver-user="root" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246031 4834 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246040 4834 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246049 4834 flags.go:64] FLAG: --system-cgroups="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246058 4834 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246073 4834 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246082 4834 flags.go:64] FLAG: --tls-cert-file="" Oct 08 22:23:13 crc 
kubenswrapper[4834]: I1008 22:23:13.246090 4834 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246101 4834 flags.go:64] FLAG: --tls-min-version="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246109 4834 flags.go:64] FLAG: --tls-private-key-file="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246118 4834 flags.go:64] FLAG: --topology-manager-policy="none" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246127 4834 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246136 4834 flags.go:64] FLAG: --topology-manager-scope="container" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246169 4834 flags.go:64] FLAG: --v="2" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246181 4834 flags.go:64] FLAG: --version="false" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246199 4834 flags.go:64] FLAG: --vmodule="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246209 4834 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.246219 4834 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246421 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246432 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246442 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246452 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246461 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 
22:23:13.246469 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246477 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246485 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246492 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246504 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246512 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246519 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246527 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246535 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246542 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246550 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246557 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246565 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246573 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246580 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 08 
22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246588 4834 feature_gate.go:330] unrecognized feature gate: Example Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246596 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246603 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246612 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246620 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246628 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246635 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246643 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246653 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246663 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246673 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246683 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246692 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246701 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246709 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246717 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246725 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246733 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246741 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246749 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246756 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246767 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246774 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246782 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246790 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246800 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246809 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246817 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246825 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246833 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246840 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246848 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246855 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246863 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246871 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246878 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246886 4834 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246893 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246902 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246912 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246920 4834 
feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246928 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246936 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246944 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246952 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246960 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246968 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246975 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246983 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.246994 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.247004 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.249680 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.262389 4834 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.262431 4834 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262554 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262566 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262576 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262585 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262593 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262601 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262609 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262617 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262625 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262633 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262641 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262648 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262656 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262664 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262674 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262685 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262695 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262704 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262713 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262721 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262730 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262737 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262745 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262753 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262761 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262768 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262776 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262783 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262792 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262799 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262807 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262815 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262822 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262830 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262838 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262849 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262858 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262866 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262874 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262884 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262895 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262904 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262912 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262920 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262928 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262935 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262944 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262951 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262961 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262971 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262980 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262989 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.262997 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263005 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263013 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263021 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263028 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263036 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263045 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263054 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263061 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263069 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263076 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263084 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263092 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263100 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263107 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263115 4834 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263122 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263130 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263138 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.263316 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263540 4834 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263565 4834 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263574 4834 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263582 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263590 4834 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263601 4834 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263611 4834 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263622 4834 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263631 4834 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263639 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263647 4834 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263655 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263663 4834 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263671 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263679 4834 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263687 4834 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263695 4834 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263702 4834 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263711 4834 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263720 4834 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263727 4834 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263735 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263742 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263750 4834 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263758 4834 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263768 4834 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263777 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263786 4834 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263795 4834 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263806 4834 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263814 4834 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263822 4834 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263830 4834 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263838 4834 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263846 4834 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263854 4834 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263861 4834 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263869 4834 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263876 4834 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263884 4834 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263892 4834 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263900 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263907 4834 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263916 4834 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263925 4834 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263933 4834 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263941 4834 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263948 4834 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263956 4834 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263964 4834 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263974 4834 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263983 4834 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263990 4834 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.263998 4834 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264005 4834 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264013 4834 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264021 4834 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264028 4834 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264037 4834 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264044 4834 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264053 4834 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264060 4834 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264068 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264075 4834 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264083 4834 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264091 4834 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264098 4834 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264106 4834 feature_gate.go:330] unrecognized feature gate: Example
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264113 4834 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264121 4834 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.264129 4834 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.264141 4834 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.265310 4834 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.271364 4834 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.271494 4834 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.273531 4834 server.go:997] "Starting client certificate rotation"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.273580 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.273867 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-07 11:17:22.048135064 +0000 UTC
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.274072 4834 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 708h54m8.774069535s for next certificate rotation
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.314888 4834 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.317624 4834 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.356872 4834 log.go:25] "Validated CRI v1 runtime API"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.408510 4834 log.go:25] "Validated CRI v1 image API"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.411203 4834 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.419094 4834 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-08-22-18-27-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.419138 4834 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.440115 4834 manager.go:217] Machine: {Timestamp:2025-10-08 22:23:13.43741769 +0000 UTC m=+1.260302486 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:df65835b-239f-4335-94d8-90a9d75b5252 BootID:d21aca01-445d-4de9-b847-f69c4d8c7264 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:cd:d9:ff Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:cd:d9:ff Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:57:ac:b1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3e:e4:2a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:52:0b:02 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:66:0c:af Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:e9:94:5e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0e:4d:60:0b:8a:30 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6a:77:ab:b2:8c:6f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.440412 4834 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.440603 4834 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.441227 4834 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.441587 4834 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.441645 4834 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.442608 4834 topology_manager.go:138] "Creating topology manager with none policy"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.442641 4834 container_manager_linux.go:303] "Creating device plugin manager"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.443338 4834 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.443433 4834 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.444278 4834 state_mem.go:36] "Initialized new in-memory state store"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.444431 4834 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.447762 4834 kubelet.go:418] "Attempting to sync node with API server"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.447799 4834 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.447836 4834 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.447858 4834 kubelet.go:324] "Adding apiserver pod source"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.447878 4834 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.452834 4834 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.453954 4834 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.456626 4834 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458385 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458430 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458446 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458459 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458482 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458496 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458511 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458532 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458550 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458567 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458585 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.458600 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.459373 4834 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.459458 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.459596 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.459633 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.459736 4834 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.460622 4834 server.go:1280] "Started kubelet" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.465401 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.465759 
4834 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 08 22:23:13 crc systemd[1]: Started Kubernetes Kubelet. Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.467997 4834 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.469760 4834 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.469771 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.470010 4834 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.470075 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 02:23:59.237209686 +0000 UTC Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.470199 4834 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 916h0m45.767018996s for next certificate rotation Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.470285 4834 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.470317 4834 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.470436 4834 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.470671 4834 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.475820 4834 server.go:460] "Adding debug handlers to kubelet server" Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.476810 4834 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.476919 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.478837 4834 factory.go:55] Registering systemd factory Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.478896 4834 factory.go:221] Registration of the systemd container factory successfully Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.479475 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.23:6443: connect: connection refused" interval="200ms" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.478554 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.23:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ca44a4f55b0df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 22:23:13.460572383 +0000 UTC m=+1.283457159,LastTimestamp:2025-10-08 22:23:13.460572383 +0000 UTC m=+1.283457159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.480989 4834 factory.go:153] Registering CRI-O factory Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.481029 4834 factory.go:221] Registration of the crio container factory successfully Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.481191 4834 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.481232 4834 factory.go:103] Registering Raw factory Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.481261 4834 manager.go:1196] Started watching for new ooms in manager Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.482283 4834 manager.go:319] Starting recovery of all containers Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491038 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491141 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491207 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 
22:23:13.491240 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491267 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491296 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491321 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491352 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491383 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491409 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491435 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491461 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491486 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491516 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491594 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491627 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491656 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491693 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491720 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491750 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491776 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491810 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491837 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491862 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491892 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491917 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491947 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.491974 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492000 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492029 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492060 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492090 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492124 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492190 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" 
seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492219 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492248 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492275 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492306 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492333 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492363 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492394 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492422 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492451 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492478 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492506 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492539 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492570 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492598 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492628 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492657 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492685 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492715 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492801 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492836 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492868 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492898 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492927 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492957 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.492987 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493017 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493045 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493072 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493098 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493126 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493194 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493224 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493252 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493284 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493311 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493339 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493372 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" 
seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493403 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493432 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493462 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493489 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493519 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493551 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493586 
4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493612 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493639 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493666 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493690 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493716 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493742 4834 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493769 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493795 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493823 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493851 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493878 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493909 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493938 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.493970 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494000 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494028 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494053 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494079 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494111 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494138 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494203 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494232 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494258 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494288 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494317 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494344 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494387 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494417 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494446 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494476 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494509 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494538 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494571 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494599 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494631 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494660 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" 
Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494686 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494716 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494741 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494764 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494786 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494808 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494831 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494857 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494880 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494905 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494931 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.494957 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.498129 4834 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.498773 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.500472 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.500643 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.500828 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.500984 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.501173 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.501341 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.501495 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.501636 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.501784 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.501971 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.502220 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.502410 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.502589 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.502752 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.502932 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.503090 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.503262 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.503430 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.503610 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.503762 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.503923 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.504064 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.505065 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.505280 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.505515 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.505674 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508033 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508109 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508128 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508171 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508192 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508208 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508225 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508244 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508261 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508277 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508294 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508315 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508335 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508356 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508371 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" 
seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508384 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508398 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508419 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508431 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508445 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508458 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508471 4834 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508484 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508504 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508519 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508572 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508587 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508601 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508618 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508632 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508645 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508658 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508670 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508682 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508722 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508735 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508748 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508763 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508777 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508792 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508806 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508819 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508864 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508879 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508897 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508913 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508952 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508968 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508984 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.508998 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.509011 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.509024 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.509039 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.509053 4834 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.509066 4834 reconstruct.go:97] "Volume reconstruction finished" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.509075 4834 reconciler.go:26] "Reconciler: start to sync state" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.514438 4834 manager.go:324] Recovery completed Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.528927 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.530645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.530688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.530702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.531473 4834 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.531492 4834 cpu_manager.go:226] "Reconciling" 
reconcilePeriod="10s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.531513 4834 state_mem.go:36] "Initialized new in-memory state store" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.550933 4834 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.554114 4834 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.554201 4834 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.554241 4834 kubelet.go:2335] "Starting kubelet main sync loop" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.554466 4834 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 08 22:23:13 crc kubenswrapper[4834]: W1008 22:23:13.555463 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.555592 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.562805 4834 policy_none.go:49] "None policy: Start" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.563936 4834 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.564026 4834 state_mem.go:35] "Initializing new in-memory 
state store" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.571192 4834 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.639737 4834 manager.go:334] "Starting Device Plugin manager" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.639827 4834 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.639846 4834 server.go:79] "Starting device plugin registration server" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.640634 4834 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.641118 4834 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.641306 4834 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.641511 4834 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.641525 4834 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.655023 4834 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.655166 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.655900 4834 eviction_manager.go:285] 
"Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.657273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.657326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.657346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.657567 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.657956 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.658053 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.658933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.658980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659000 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659213 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659339 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659408 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.659929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.660557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.660636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.660658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.660977 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.661030 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.661074 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.660980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.661192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.661218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662717 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: 
I1008 22:23:13.662801 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.662865 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664743 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.664803 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.666008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.666060 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.666079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.680884 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.23:6443: connect: connection refused" interval="400ms" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712458 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712505 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712579 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712611 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712641 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712745 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712806 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.712917 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.713077 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.713196 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.713236 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.741751 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.744187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.744259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.744282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.744325 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.745545 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.23:6443: connect: connection refused" 
node="crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814655 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814701 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814749 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814781 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814858 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814790 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814910 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814947 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814965 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.814917 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815027 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815089 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815205 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815305 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815346 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815357 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815424 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815439 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815461 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815472 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815505 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815533 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815546 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815556 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815567 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815665 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.815780 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.946515 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.948225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.948320 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.948344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:13 crc kubenswrapper[4834]: I1008 22:23:13.948414 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:13 crc kubenswrapper[4834]: E1008 22:23:13.949058 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.23:6443: connect: connection refused" node="crc" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.003766 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.015988 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.042668 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.057348 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.063579 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.071340 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ce238890770e17211d60a73495c464fc75bfa6742ecc2043bd7e558ef6a78c6c WatchSource:0}: Error finding container ce238890770e17211d60a73495c464fc75bfa6742ecc2043bd7e558ef6a78c6c: Status 404 returned error can't find the container with id ce238890770e17211d60a73495c464fc75bfa6742ecc2043bd7e558ef6a78c6c Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.077090 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5ef8b10170defbb46c458d72f212d030e4d23b67dc695031d2cd88a6cd6079fe WatchSource:0}: Error finding container 5ef8b10170defbb46c458d72f212d030e4d23b67dc695031d2cd88a6cd6079fe: Status 404 returned error 
can't find the container with id 5ef8b10170defbb46c458d72f212d030e4d23b67dc695031d2cd88a6cd6079fe Oct 08 22:23:14 crc kubenswrapper[4834]: E1008 22:23:14.081579 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.23:6443: connect: connection refused" interval="800ms" Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.092604 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-005843575e18695dff6161a9eb9e5b43e7edca3ea4f1d0d2a35129cf40fd3177 WatchSource:0}: Error finding container 005843575e18695dff6161a9eb9e5b43e7edca3ea4f1d0d2a35129cf40fd3177: Status 404 returned error can't find the container with id 005843575e18695dff6161a9eb9e5b43e7edca3ea4f1d0d2a35129cf40fd3177 Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.101477 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6c8ce2d6083404a3faab9399a8846814566097bd33b68071a586c5dd21b73f44 WatchSource:0}: Error finding container 6c8ce2d6083404a3faab9399a8846814566097bd33b68071a586c5dd21b73f44: Status 404 returned error can't find the container with id 6c8ce2d6083404a3faab9399a8846814566097bd33b68071a586c5dd21b73f44 Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.103736 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7f8d95317246b9367ef0338fc53ae3faf028fa3fb1ee70c08635d8bb945ec0ad WatchSource:0}: Error finding container 7f8d95317246b9367ef0338fc53ae3faf028fa3fb1ee70c08635d8bb945ec0ad: Status 404 returned error can't find the container with id 
7f8d95317246b9367ef0338fc53ae3faf028fa3fb1ee70c08635d8bb945ec0ad Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.297300 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:14 crc kubenswrapper[4834]: E1008 22:23:14.297451 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.350044 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.352085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.352215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.352239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.352337 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:14 crc kubenswrapper[4834]: E1008 22:23:14.353003 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.23:6443: connect: connection refused" node="crc" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.466944 4834 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.466968 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:14 crc kubenswrapper[4834]: E1008 22:23:14.467095 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.559758 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5ef8b10170defbb46c458d72f212d030e4d23b67dc695031d2cd88a6cd6079fe"} Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.561981 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ce238890770e17211d60a73495c464fc75bfa6742ecc2043bd7e558ef6a78c6c"} Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.563321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7f8d95317246b9367ef0338fc53ae3faf028fa3fb1ee70c08635d8bb945ec0ad"} Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.564342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6c8ce2d6083404a3faab9399a8846814566097bd33b68071a586c5dd21b73f44"} Oct 08 22:23:14 crc kubenswrapper[4834]: I1008 22:23:14.566805 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"005843575e18695dff6161a9eb9e5b43e7edca3ea4f1d0d2a35129cf40fd3177"} Oct 08 22:23:14 crc kubenswrapper[4834]: E1008 22:23:14.883382 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.23:6443: connect: connection refused" interval="1.6s" Oct 08 22:23:14 crc kubenswrapper[4834]: W1008 22:23:14.924106 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:14 crc kubenswrapper[4834]: E1008 22:23:14.924269 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:15 crc kubenswrapper[4834]: W1008 22:23:15.023055 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:15 crc kubenswrapper[4834]: E1008 22:23:15.023229 
4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.153542 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.155737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.155819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.155843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.155902 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:15 crc kubenswrapper[4834]: E1008 22:23:15.156714 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.23:6443: connect: connection refused" node="crc" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.466522 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.573660 4834 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2" exitCode=0 Oct 08 22:23:15 crc 
kubenswrapper[4834]: I1008 22:23:15.573795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2"} Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.573817 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.575381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.575426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.575445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.577018 4834 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8" exitCode=0 Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.577140 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8"} Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.577222 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.578415 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.578474 4834 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.578494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.582429 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588"} Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.582473 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207"} Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.582493 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6"} Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.586703 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2" exitCode=0 Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.586852 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.586909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2"} Oct 08 22:23:15 crc 
kubenswrapper[4834]: I1008 22:23:15.588299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.588367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.588393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.590263 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79" exitCode=0 Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.590332 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79"} Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.590455 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.591754 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.591821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.591846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.594773 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.596723 4834 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.596784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:15 crc kubenswrapper[4834]: I1008 22:23:15.596804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.466294 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:16 crc kubenswrapper[4834]: E1008 22:23:16.485879 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.23:6443: connect: connection refused" interval="3.2s" Oct 08 22:23:16 crc kubenswrapper[4834]: W1008 22:23:16.592180 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection refused Oct 08 22:23:16 crc kubenswrapper[4834]: E1008 22:23:16.592292 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.596685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.596731 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.596739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.596861 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.597737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.597772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.597781 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.601181 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.601265 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:16 crc 
kubenswrapper[4834]: I1008 22:23:16.602256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.602281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.602289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.604121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.604177 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.604195 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.606734 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9" exitCode=0 Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.606812 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9"} Oct 08 
22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.606860 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.607839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.607864 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.607875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.608704 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368"} Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.608765 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.609401 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.609428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.609442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:16 crc kubenswrapper[4834]: W1008 22:23:16.609771 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.23:6443: connect: connection 
refused Oct 08 22:23:16 crc kubenswrapper[4834]: E1008 22:23:16.609869 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.23:6443: connect: connection refused" logger="UnhandledError" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.757670 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.759274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.759315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.759329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:16 crc kubenswrapper[4834]: I1008 22:23:16.759360 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:16 crc kubenswrapper[4834]: E1008 22:23:16.759849 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.23:6443: connect: connection refused" node="crc" Oct 08 22:23:17 crc kubenswrapper[4834]: E1008 22:23:17.059977 4834 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.23:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ca44a4f55b0df default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-08 22:23:13.460572383 +0000 UTC m=+1.283457159,LastTimestamp:2025-10-08 22:23:13.460572383 +0000 UTC m=+1.283457159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.148116 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.307909 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.617749 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8"} Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.617812 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63"} Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.617911 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.619572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.619623 4834 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.619636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.621841 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.622263 4834 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014" exitCode=0 Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.622400 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.622420 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014"} Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.622406 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.622512 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.622453 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.623033 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.624994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625042 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625171 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625202 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625214 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.625973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.626001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.626191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.626259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:17 crc kubenswrapper[4834]: I1008 22:23:17.626353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.621775 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.634065 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6"} Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.634125 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8"} Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.634164 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1"} Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.634246 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.634304 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.634512 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.636072 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.642213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.642276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.642296 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.643742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.643821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:18 crc kubenswrapper[4834]: I1008 22:23:18.643844 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.644476 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f"} Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.644586 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92"} Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.644624 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.644728 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.645601 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.646097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:19 crc 
kubenswrapper[4834]: I1008 22:23:19.646186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.646207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.646449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.646537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.646564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.647294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.647341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.647359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.960552 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.962901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.963186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.963352 4834 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 22:23:19 crc kubenswrapper[4834]: I1008 22:23:19.963523 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.308286 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.308732 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.393230 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.647752 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.648667 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.649020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.649088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.649241 4834 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.650040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.650347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:20 crc kubenswrapper[4834]: I1008 22:23:20.650532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.484012 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.484511 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.486590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.486655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.486676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.775766 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.776111 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.777779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.777833 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.777851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.783412 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.875715 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.876005 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.877756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.877804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:22 crc kubenswrapper[4834]: I1008 22:23:22.877821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:23 crc kubenswrapper[4834]: E1008 22:23:23.656038 4834 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 08 22:23:23 crc kubenswrapper[4834]: I1008 22:23:23.657990 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:23 crc kubenswrapper[4834]: I1008 22:23:23.659597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:23 crc kubenswrapper[4834]: I1008 22:23:23.659677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 08 22:23:23 crc kubenswrapper[4834]: I1008 22:23:23.659711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.110373 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.110480 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.156509 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.156692 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.158443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.158494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.158513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:27 crc kubenswrapper[4834]: W1008 22:23:27.274083 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.274278 4834 trace.go:236] Trace[1416472194]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 22:23:17.272) (total time: 10001ms): Oct 08 22:23:27 crc kubenswrapper[4834]: Trace[1416472194]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (22:23:27.274) Oct 08 22:23:27 crc kubenswrapper[4834]: Trace[1416472194]: [10.001929015s] [10.001929015s] END Oct 08 22:23:27 crc kubenswrapper[4834]: E1008 22:23:27.274315 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 22:23:27 crc kubenswrapper[4834]: W1008 22:23:27.423822 4834 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.423961 4834 trace.go:236] Trace[326940680]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 22:23:17.422) (total time: 10001ms): Oct 08 22:23:27 crc kubenswrapper[4834]: Trace[326940680]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (22:23:27.423) Oct 08 22:23:27 crc kubenswrapper[4834]: Trace[326940680]: [10.001824462s] [10.001824462s] END Oct 08 22:23:27 crc kubenswrapper[4834]: E1008 
22:23:27.424000 4834 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 08 22:23:27 crc kubenswrapper[4834]: I1008 22:23:27.468315 4834 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 08 22:23:28 crc kubenswrapper[4834]: I1008 22:23:28.498390 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 22:23:28 crc kubenswrapper[4834]: I1008 22:23:28.498484 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 08 22:23:28 crc kubenswrapper[4834]: I1008 22:23:28.508654 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 08 22:23:28 crc kubenswrapper[4834]: I1008 22:23:28.508738 4834 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 08 22:23:30 crc kubenswrapper[4834]: I1008 22:23:30.308038 4834 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 08 22:23:30 crc kubenswrapper[4834]: I1008 22:23:30.309472 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.235868 4834 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.492127 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.492430 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.494013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.494072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.494092 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.500696 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.691851 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.693441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.693501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.693521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.829621 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.829899 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.831966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.832029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.832050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:32 crc kubenswrapper[4834]: I1008 22:23:32.851514 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 08 
22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.108569 4834 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.460760 4834 apiserver.go:52] "Watching apiserver" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.468824 4834 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.469329 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.469885 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.469958 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.469908 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.470290 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.470380 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.470442 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.470477 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.470772 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.470883 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.471214 4834 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.474180 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.474274 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.474273 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.474475 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.474815 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.475022 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.475101 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.475366 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.477314 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 08 22:23:33 crc kubenswrapper[4834]: 
E1008 22:23:33.489063 4834 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.493688 4834 trace.go:236] Trace[176403524]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 22:23:19.824) (total time: 13668ms): Oct 08 22:23:33 crc kubenswrapper[4834]: Trace[176403524]: ---"Objects listed" error: 13668ms (22:23:33.493) Oct 08 22:23:33 crc kubenswrapper[4834]: Trace[176403524]: [13.668859315s] [13.668859315s] END Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.493725 4834 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.494805 4834 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.494814 4834 trace.go:236] Trace[1795827409]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Oct-2025 22:23:21.711) (total time: 11782ms): Oct 08 22:23:33 crc kubenswrapper[4834]: Trace[1795827409]: ---"Objects listed" error: 11782ms (22:23:33.494) Oct 08 22:23:33 crc kubenswrapper[4834]: Trace[1795827409]: [11.782841365s] [11.782841365s] END Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.494874 4834 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.508447 4834 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.525740 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.550332 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.571673 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.573671 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45312->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.573763 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59900->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.573752 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:45312->192.168.126.11:17697: read: connection reset by peer" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.573859 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59900->192.168.126.11:17697: read: connection reset by peer" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.574342 4834 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= 
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.574401 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.590282 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595387 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595500 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 
22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595543 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595590 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595629 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595666 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595708 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595749 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595792 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595839 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.595936 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596051 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596640 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596659 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596693 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596821 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596923 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.596981 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597023 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597061 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 22:23:33 
crc kubenswrapper[4834]: I1008 22:23:33.597086 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597106 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597113 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597199 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597258 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597305 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597341 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597379 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597416 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597486 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597505 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597526 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597565 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597586 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597598 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597687 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597737 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597776 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597819 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597860 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597898 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597934 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597970 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598012 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598049 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 
22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598089 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598126 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598196 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598233 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598320 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598359 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598396 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598435 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598471 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598506 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.598540 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598577 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598612 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598683 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598717 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598753 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598786 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598823 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598862 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598896 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598929 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598963 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599002 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599036 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599076 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599121 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599614 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599670 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599712 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599752 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599792 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599965 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600012 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600051 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600184 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600225 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600271 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.600308 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600346 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600381 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600422 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600463 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600501 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600536 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600632 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600671 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600706 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 
22:23:33.600743 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600780 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600821 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600859 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600895 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.600931 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601110 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601196 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601284 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601322 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601359 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601399 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601436 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601473 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601511 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601552 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601593 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601632 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601669 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601708 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601744 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601783 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601832 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601872 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601918 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601957 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602056 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602097 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602135 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602208 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602251 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602294 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602333 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602374 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602412 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602484 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602570 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602608 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602689 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602729 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602769 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602805 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602845 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602881 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602920 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602959 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603037 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603075 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603116 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603190 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603230 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 
22:23:33.603268 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603308 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603349 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603389 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603430 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603475 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603516 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603562 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603602 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603641 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603683 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603721 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603760 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603800 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603837 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603878 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603919 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603958 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604045 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604106 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604172 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597776 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597912 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.597964 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598064 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598332 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.598594 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599212 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599290 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599489 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599503 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599550 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599733 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604410 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606207 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606254 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606305 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606347 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606387 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606426 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606468 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606506 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606544 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606583 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606624 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606667 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606707 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606746 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606831 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606874 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606917 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607004 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607050 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607091 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607130 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607204 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607244 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607286 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607324 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607365 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607406 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607452 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607501 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607573 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607628 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607671 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607713 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607758 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607846 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607889 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607931 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607972 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607955 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608021 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608135 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608211 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608332 4834 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608359 4834 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608385 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608454 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608480 4834 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608539 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608564 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608587 4834 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608645 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608673 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608697 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608720 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608745 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608769 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608794 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608818 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608845 4834 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608871 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608894 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node 
\"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608919 4834 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608941 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608965 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608988 4834 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.626137 4834 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.599984 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601060 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601397 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601529 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601796 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.629216 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.601850 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602011 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602128 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602263 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602405 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602483 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.602908 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603305 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603728 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603988 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604018 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.603996 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604069 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604309 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.604549 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.605346 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.605502 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606097 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606218 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606258 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606728 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.606771 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607331 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607826 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.607949 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608414 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608544 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608651 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.608730 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.609071 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.609480 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.609625 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.609973 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.610049 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.610579 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.610907 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.610974 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.610995 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.611312 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.611734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.612304 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.612326 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.612431 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.613085 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.613678 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614218 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614227 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.614377 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:23:34.114355238 +0000 UTC m=+21.937240084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614514 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614548 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614562 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614587 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.614839 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615751 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615840 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615836 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615881 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615903 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.615945 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616008 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616274 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616503 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616405 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616645 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.616994 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617178 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617261 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617404 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617414 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617453 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617447 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617491 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617531 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.617606 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617862 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.617904 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.618120 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.618217 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.618250 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.618941 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.619182 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.619656 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.619927 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.620012 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.620518 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.620606 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.620672 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.620745 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.620849 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.621068 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.622109 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.622133 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.622473 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.623832 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.624221 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.624436 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.624723 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.624903 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.625630 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.625772 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.626513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.626739 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.627092 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.627179 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.627336 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.627405 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.627842 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.627883 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.627906 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.628010 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.628382 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.628506 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.629010 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.629141 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.630122 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.630369 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.630667 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.630731 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.631031 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.631347 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:34.131306279 +0000 UTC m=+21.954191105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.631635 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:23:34.131613025 +0000 UTC m=+21.954497901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.632304 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.632671 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.632772 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.633413 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.636544 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.637515 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.637769 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.637976 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.638034 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.638556 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.642611 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.642837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.642987 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.647494 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.650832 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655043 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655047 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655434 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655687 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655700 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655910 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.655981 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.656024 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.656247 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.656405 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.656865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.657973 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.658361 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.658530 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.658557 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.658666 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.659488 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.659561 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.659726 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.659746 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.659853 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:34.159823358 +0000 UTC m=+21.982708114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.661394 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.661715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.661774 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.662211 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.662385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.662629 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.662827 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.663210 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.663513 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.663560 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.663623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.664114 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.664714 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.665295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.665672 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.666794 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.666898 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.668333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.672488 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.672731 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.681551 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.681789 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.684394 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.689838 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.698860 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.699623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709442 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709721 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709813 4834 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709944 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709986 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710013 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath 
\"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710031 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710044 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710060 4834 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710073 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710087 4834 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710101 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710114 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710129 4834 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710173 4834 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710236 4834 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710261 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710280 4834 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710297 4834 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710315 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710330 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710345 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710359 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710374 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710390 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710406 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710423 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710439 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710457 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710472 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710488 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710502 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710516 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710564 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710580 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710616 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710699 4834 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710716 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710730 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710744 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710757 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710797 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.710813 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710828 4834 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710890 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710906 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710922 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710938 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710953 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710968 4834 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710984 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.710999 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711016 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711034 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711050 4834 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711069 4834 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711084 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711099 4834 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711116 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711177 4834 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711192 4834 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711208 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711222 4834 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711272 4834 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" 
Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711308 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711324 4834 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711338 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711378 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711414 4834 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711430 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711506 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711655 4834 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711675 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711690 4834 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711706 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711721 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711735 4834 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711749 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711763 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") 
on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711778 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711792 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711805 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711820 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711835 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711849 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711863 4834 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.711877 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711893 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711911 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711926 4834 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711942 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711955 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711968 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711982 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.711998 4834 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712013 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712029 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712046 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712061 4834 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712076 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712090 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.712104 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712123 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712163 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712177 4834 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712192 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712206 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712221 4834 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.712236 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712267 4834 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712281 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712295 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712310 4834 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712325 4834 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712341 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 
22:23:33.712357 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712371 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712384 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712398 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712413 4834 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712428 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712443 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712457 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712470 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712484 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712498 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712510 4834 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712525 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712539 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712553 4834 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" 
DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712568 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712581 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712596 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712609 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712622 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712637 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712650 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712663 4834 reconciler_common.go:293] 
"Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712676 4834 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712690 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712706 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712720 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712735 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712748 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712766 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712779 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712792 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712807 4834 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712822 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712835 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712850 4834 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712863 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc 
kubenswrapper[4834]: I1008 22:23:33.712876 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712891 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712905 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712918 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712930 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712942 4834 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712955 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 
22:23:33.712970 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712982 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.712995 4834 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713007 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713020 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713032 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713048 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713066 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713082 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713096 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713109 4834 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713122 4834 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.709954 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713136 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.713167 4834 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.718132 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.746529 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.747573 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.766612 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.766666 4834 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.766685 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:33 crc kubenswrapper[4834]: E1008 22:23:33.766767 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:34.266739909 +0000 UTC m=+22.089624665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.776320 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.795192 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.801517 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.813810 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.826017 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 08 22:23:33 crc kubenswrapper[4834]: W1008 22:23:33.839317 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7247b8a46f25d9c540934267a4a9d60010b3d0ca50921b5e84f8efa80d1e20f3 WatchSource:0}: Error finding container 7247b8a46f25d9c540934267a4a9d60010b3d0ca50921b5e84f8efa80d1e20f3: Status 404 returned error can't find the container with id 7247b8a46f25d9c540934267a4a9d60010b3d0ca50921b5e84f8efa80d1e20f3 Oct 08 22:23:33 crc kubenswrapper[4834]: I1008 22:23:33.842899 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 08 22:23:33 crc kubenswrapper[4834]: W1008 22:23:33.866455 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-60ea78a6698caefe0afe80ba6fe1dd003cfd77955ec140c27d73447672c86d49 WatchSource:0}: Error finding container 60ea78a6698caefe0afe80ba6fe1dd003cfd77955ec140c27d73447672c86d49: Status 404 returned error can't find the container with id 60ea78a6698caefe0afe80ba6fe1dd003cfd77955ec140c27d73447672c86d49 Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.116861 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.117200 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:23:35.117083604 +0000 UTC m=+22.939968390 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.218311 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.218368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.218416 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218447 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218548 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:35.218524371 +0000 UTC m=+23.041409147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218566 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218657 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:35.218609694 +0000 UTC m=+23.041494480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218799 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218900 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.218954 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.219126 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:35.219069464 +0000 UTC m=+23.041954250 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.319819 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.320119 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.320196 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.320216 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.320505 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:23:35.32047224 +0000 UTC m=+23.143357006 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.554738 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:34 crc kubenswrapper[4834]: E1008 22:23:34.554925 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.699098 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.702537 4834 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8" exitCode=255 Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.702646 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.706264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.706323 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.706353 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"60ea78a6698caefe0afe80ba6fe1dd003cfd77955ec140c27d73447672c86d49"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 
22:23:34.707975 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7247b8a46f25d9c540934267a4a9d60010b3d0ca50921b5e84f8efa80d1e20f3"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.710867 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.711040 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ac19544a5d1cf021ba362bd279dac4ad0df55923dc901cd43b15c04cfe249f4c"} Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.720314 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.721314 4834 scope.go:117] "RemoveContainer" containerID="5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.721382 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.744032 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.757694 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.803886 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.821592 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.837358 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.853931 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.876616 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.892647 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.907762 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.923542 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.937710 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-di
r\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.952717 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.970184 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:34 crc kubenswrapper[4834]: I1008 22:23:34.982894 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.127336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.127627 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:23:37.127579049 +0000 UTC m=+24.950463835 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.228768 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.228841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.228870 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229031 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229051 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229044 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229107 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229186 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:37.229166279 +0000 UTC m=+25.052051025 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229361 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:37.229326192 +0000 UTC m=+25.052210968 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229065 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.229430 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:37.229417084 +0000 UTC m=+25.052301860 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.330311 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.330517 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.330562 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.330590 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.330660 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:23:37.330639357 +0000 UTC m=+25.153524163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.554504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.554703 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.554504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:35 crc kubenswrapper[4834]: E1008 22:23:35.554834 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.561665 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.562534 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.564793 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.566713 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.568926 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.570005 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.571371 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.573468 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.574821 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.577330 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.578457 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.581090 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.582176 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.583395 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.585488 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.586746 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.589801 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.591656 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.592892 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.595117 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.596056 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.597307 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.599114 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.600472 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.602348 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.603652 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.605747 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.606728 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.608803 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.609773 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.610825 4834 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.611052 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.615320 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.616315 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.618028 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.621493 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.622884 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.625427 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.627537 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.629731 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.630768 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.632978 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.634525 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.636537 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.637173 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.638656 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.639817 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.641420 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.642068 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.643426 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.644063 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.644839 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.646103 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.646747 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.719288 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.722674 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890"} Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.766071 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b45
36812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.783221 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.802415 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.821447 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.844737 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.869747 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.909341 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:35 crc kubenswrapper[4834]: I1008 22:23:35.929056 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:36 crc kubenswrapper[4834]: I1008 22:23:36.555496 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:36 crc kubenswrapper[4834]: E1008 22:23:36.555678 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:36 crc kubenswrapper[4834]: I1008 22:23:36.725932 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.147728 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.147985 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:23:41.147955754 +0000 UTC m=+28.970840530 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.248815 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.248937 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.248976 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249104 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249198 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249204 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249308 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249331 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249273 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:41.249241758 +0000 UTC m=+29.072126534 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249546 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:41.249462593 +0000 UTC m=+29.072347379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.249625 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:41.249603796 +0000 UTC m=+29.072488682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.328894 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.335474 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.344115 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.350344 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.350623 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.350674 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.350696 4834 projected.go:194] Error 
preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.350790 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:41.350761427 +0000 UTC m=+29.173646203 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.359552 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, 
/tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.385569 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.408168 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.427606 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.458580 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.480967 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.502990 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.522014 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.555316 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.555493 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.555595 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.555975 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.562434 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-0
8T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.585540 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.607348 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.629572 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.654900 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.681553 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.703498 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.723560 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.731941 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807"} Oct 08 22:23:37 crc kubenswrapper[4834]: E1008 22:23:37.748091 4834 
kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.749212 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.769190 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.790561 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.812292 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.846284 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.867322 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.887937 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.905924 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, 
/tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.926865 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:37 crc kubenswrapper[4834]: I1008 22:23:37.943815 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:38 crc kubenswrapper[4834]: I1008 22:23:38.554815 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:38 crc kubenswrapper[4834]: E1008 22:23:38.555016 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.393031 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-kh4dw"] Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.393494 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.395586 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.395725 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.396577 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.422540 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.440910 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.455195 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.469389 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.470630 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9t6\" (UniqueName: \"kubernetes.io/projected/08f0a28c-1bc4-4302-bc15-b2d975ef0b47-kube-api-access-wg9t6\") pod \"node-resolver-kh4dw\" (UID: \"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\") " pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.470796 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/08f0a28c-1bc4-4302-bc15-b2d975ef0b47-hosts-file\") pod \"node-resolver-kh4dw\" (UID: \"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\") " pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.483243 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.498198 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.514086 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.527294 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.542945 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.554915 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:39 crc kubenswrapper[4834]: E1008 22:23:39.555115 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.555348 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:39 crc kubenswrapper[4834]: E1008 22:23:39.555588 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.560415 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.571950 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/08f0a28c-1bc4-4302-bc15-b2d975ef0b47-hosts-file\") pod \"node-resolver-kh4dw\" (UID: \"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\") " pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.572199 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/08f0a28c-1bc4-4302-bc15-b2d975ef0b47-hosts-file\") pod \"node-resolver-kh4dw\" (UID: \"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\") " pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.572229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9t6\" (UniqueName: \"kubernetes.io/projected/08f0a28c-1bc4-4302-bc15-b2d975ef0b47-kube-api-access-wg9t6\") pod \"node-resolver-kh4dw\" (UID: \"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\") " pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.592993 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9t6\" (UniqueName: \"kubernetes.io/projected/08f0a28c-1bc4-4302-bc15-b2d975ef0b47-kube-api-access-wg9t6\") pod \"node-resolver-kh4dw\" (UID: \"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\") " pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.705312 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kh4dw" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.744497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kh4dw" event={"ID":"08f0a28c-1bc4-4302-bc15-b2d975ef0b47","Type":"ContainerStarted","Data":"30f629124df55fef85d9c443270e64e7388a15fec1e785f4179bc13f918fc5f9"} Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.797180 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-f297z"] Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.797591 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.800430 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wrrs9"] Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.806881 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.810821 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.811246 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.811467 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.822500 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xsqwx"] Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.823202 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.823415 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.823670 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.823891 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.824092 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.824291 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.824342 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-f9m4z"] Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.824849 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.825015 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.825120 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.825383 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.825591 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.830809 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.831078 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.831217 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.831325 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.831271 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.831550 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.831730 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.853446 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875017 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-conf-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875091 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/732cf917-b3ec-4649-99b0-66653902cfc2-mcd-auth-proxy-config\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 
22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875124 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b150123b-551e-4c12-afa1-0c651719d3f2-multus-daemon-config\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875178 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnh95\" (UniqueName: \"kubernetes.io/projected/b150123b-551e-4c12-afa1-0c651719d3f2-kube-api-access-jnh95\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875208 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-slash\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875240 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-var-lib-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875266 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-ovn\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 
crc kubenswrapper[4834]: I1008 22:23:39.875291 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-config\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875317 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-netns\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875345 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-kubelet\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875373 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875392 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b150123b-551e-4c12-afa1-0c651719d3f2-cni-binary-copy\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 
22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875413 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-cni-bin\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875433 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-etc-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875499 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-k8s-cni-cncf-io\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875520 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-kubelet\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 
08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875590 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-os-release\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875670 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-socket-dir-parent\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875698 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-multus-certs\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875717 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw97q\" (UniqueName: \"kubernetes.io/projected/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-kube-api-access-fw97q\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875747 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkrrl\" (UniqueName: \"kubernetes.io/projected/732cf917-b3ec-4649-99b0-66653902cfc2-kube-api-access-hkrrl\") pod \"machine-config-daemon-f9m4z\" (UID: 
\"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875766 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/732cf917-b3ec-4649-99b0-66653902cfc2-rootfs\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875785 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-system-cni-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875816 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-cnibin\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875838 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-os-release\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875861 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-bin\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875879 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-env-overrides\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875916 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-cni-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875951 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-cni-multus\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875969 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-netns\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.875987 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cni-binary-copy\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: 
\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-systemd\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876028 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/732cf917-b3ec-4649-99b0-66653902cfc2-proxy-tls\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876048 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-script-lib\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876131 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cnibin\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876163 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-hostroot\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876181 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-etc-kubernetes\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876207 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqcx9\" (UniqueName: \"kubernetes.io/projected/0ae96bc9-69e0-4f09-b59b-92157a2d5948-kube-api-access-fqcx9\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876232 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-node-log\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876255 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-netd\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876273 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876294 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-log-socket\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876312 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovn-node-metrics-cert\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876416 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-system-cni-dir\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.876436 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-systemd-units\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.909701 4834 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.911864 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.915337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.915383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.915398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.915769 4834 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 08 22:23:39 crc 
kubenswrapper[4834]: I1008 22:23:39.926882 4834 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.927575 4834 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.928891 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.928948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.928961 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.928985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.928999 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:39Z","lastTransitionTime":"2025-10-08T22:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.934474 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b23
5a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: E1008 22:23:39.953930 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.959698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.959745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.959757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.959777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.959790 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:39Z","lastTransitionTime":"2025-10-08T22:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.972870 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: E1008 22:23:39.976126 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:39Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.976880 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cni-binary-copy\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.976974 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-systemd\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/732cf917-b3ec-4649-99b0-66653902cfc2-proxy-tls\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977131 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-script-lib\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977224 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977297 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cnibin\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977392 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-hostroot\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977543 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-etc-kubernetes\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977627 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcx9\" (UniqueName: \"kubernetes.io/projected/0ae96bc9-69e0-4f09-b59b-92157a2d5948-kube-api-access-fqcx9\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977698 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-node-log\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977771 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-netd\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977848 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977926 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cnibin\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977927 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-system-cni-dir\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978008 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-systemd-units\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-log-socket\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978045 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovn-node-metrics-cert\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978064 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-conf-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978089 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978110 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/732cf917-b3ec-4649-99b0-66653902cfc2-mcd-auth-proxy-config\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978138 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b150123b-551e-4c12-afa1-0c651719d3f2-multus-daemon-config\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978183 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnh95\" (UniqueName: \"kubernetes.io/projected/b150123b-551e-4c12-afa1-0c651719d3f2-kube-api-access-jnh95\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978205 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-slash\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978222 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-var-lib-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-ovn\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978275 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-config\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978301 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-netns\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978330 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978354 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b150123b-551e-4c12-afa1-0c651719d3f2-cni-binary-copy\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978380 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-cni-bin\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978402 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-kubelet\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978431 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-k8s-cni-cncf-io\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978456 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-kubelet\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " 
pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978480 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-etc-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978503 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978523 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-os-release\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978547 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-socket-dir-parent\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978574 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-multus-certs\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc 
kubenswrapper[4834]: I1008 22:23:39.978597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw97q\" (UniqueName: \"kubernetes.io/projected/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-kube-api-access-fw97q\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978619 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkrrl\" (UniqueName: \"kubernetes.io/projected/732cf917-b3ec-4649-99b0-66653902cfc2-kube-api-access-hkrrl\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/732cf917-b3ec-4649-99b0-66653902cfc2-rootfs\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978678 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-system-cni-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978704 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-cnibin\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978725 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-os-release\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978748 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-bin\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978751 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-node-log\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978800 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-cni-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978751 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-etc-kubernetes\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978800 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-netd\") 
pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-k8s-cni-cncf-io\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-cnibin\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979008 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-multus-certs\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.978997 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-etc-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979006 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-socket-dir-parent\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.977882 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cni-binary-copy\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-kubelet\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979173 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-os-release\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc 
kubenswrapper[4834]: I1008 22:23:39.979208 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/732cf917-b3ec-4649-99b0-66653902cfc2-rootfs\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979245 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-cni-bin\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979254 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-os-release\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979244 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-cni-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-bin\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979376 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979383 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-system-cni-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979397 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-systemd\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.979426 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-slash\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980109 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-hostroot\") pod \"multus-f297z\" (UID: 
\"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980240 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-cni-multus\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980273 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b150123b-551e-4c12-afa1-0c651719d3f2-cni-binary-copy\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-kubelet\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-multus-conf-dir\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980676 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-ovn\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.980706 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-var-lib-openvswitch\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981205 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/732cf917-b3ec-4649-99b0-66653902cfc2-mcd-auth-proxy-config\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981273 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-netns\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981301 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-env-overrides\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981516 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0ae96bc9-69e0-4f09-b59b-92157a2d5948-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981660 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0ae96bc9-69e0-4f09-b59b-92157a2d5948-system-cni-dir\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-run-netns\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981785 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-log-socket\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981840 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-systemd-units\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981932 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-env-overrides\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981939 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-script-lib\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.981979 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b150123b-551e-4c12-afa1-0c651719d3f2-host-var-lib-cni-multus\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.982055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-netns\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.983625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.983665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.983677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.983705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.983722 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:39Z","lastTransitionTime":"2025-10-08T22:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.984207 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovn-node-metrics-cert\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.987753 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/732cf917-b3ec-4649-99b0-66653902cfc2-proxy-tls\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.990844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-config\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:39 crc kubenswrapper[4834]: I1008 22:23:39.994552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b150123b-551e-4c12-afa1-0c651719d3f2-multus-daemon-config\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:40 crc kubenswrapper[4834]: E1008 22:23:40.011739 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.017851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcx9\" (UniqueName: \"kubernetes.io/projected/0ae96bc9-69e0-4f09-b59b-92157a2d5948-kube-api-access-fqcx9\") pod \"multus-additional-cni-plugins-xsqwx\" (UID: \"0ae96bc9-69e0-4f09-b59b-92157a2d5948\") " pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.018389 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw97q\" (UniqueName: \"kubernetes.io/projected/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-kube-api-access-fw97q\") pod \"ovnkube-node-wrrs9\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.018737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.018840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.019014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.019113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.019216 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.018848 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnh95\" (UniqueName: \"kubernetes.io/projected/b150123b-551e-4c12-afa1-0c651719d3f2-kube-api-access-jnh95\") pod \"multus-f297z\" (UID: \"b150123b-551e-4c12-afa1-0c651719d3f2\") " pod="openshift-multus/multus-f297z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.022766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkrrl\" (UniqueName: \"kubernetes.io/projected/732cf917-b3ec-4649-99b0-66653902cfc2-kube-api-access-hkrrl\") pod \"machine-config-daemon-f9m4z\" (UID: \"732cf917-b3ec-4649-99b0-66653902cfc2\") " pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.024333 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: E1008 22:23:40.033422 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.039960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.040006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.040019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.040041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.040054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.041461 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.054689 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: E1008 22:23:40.055138 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: E1008 22:23:40.055300 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.056981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.057038 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.057050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.057071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.057083 4834 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.070466 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.083572 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27
6703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.098886 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.111986 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.132449 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-f297z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.143163 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.146387 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:40 crc kubenswrapper[4834]: W1008 22:23:40.147511 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb150123b_551e_4c12_afa1_0c651719d3f2.slice/crio-b8c931c82380f852a7d9911ae403485b5e9346e98a943b0481d127b4a88d75a8 WatchSource:0}: Error finding container b8c931c82380f852a7d9911ae403485b5e9346e98a943b0481d127b4a88d75a8: Status 404 returned error can't find the container with id b8c931c82380f852a7d9911ae403485b5e9346e98a943b0481d127b4a88d75a8 Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.156604 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.161570 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.161631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.161646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.161669 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.161682 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.170014 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.173715 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.195504 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: W1008 22:23:40.215383 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod732cf917_b3ec_4649_99b0_66653902cfc2.slice/crio-64b385554ab08a782c948574db83f2e9802eebd84c88479aa3b2668df1f80c99 WatchSource:0}: Error finding container 64b385554ab08a782c948574db83f2e9802eebd84c88479aa3b2668df1f80c99: Status 404 returned error can't find the container with id 64b385554ab08a782c948574db83f2e9802eebd84c88479aa3b2668df1f80c99 Oct 08 22:23:40 crc kubenswrapper[4834]: W1008 22:23:40.216533 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae96bc9_69e0_4f09_b59b_92157a2d5948.slice/crio-3a75536bc7ba667925efec55503da5c6a7450c21d6abf1ad5aa50907e2beaaa2 WatchSource:0}: Error finding container 
3a75536bc7ba667925efec55503da5c6a7450c21d6abf1ad5aa50907e2beaaa2: Status 404 returned error can't find the container with id 3a75536bc7ba667925efec55503da5c6a7450c21d6abf1ad5aa50907e2beaaa2 Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.230471 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.247790 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.266594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.266646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.266659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.266685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.266702 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.268342 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.280731 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.299925 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.319478 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.333265 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.347222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.368109 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62
a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.373962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.374019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.374037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.374059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.374071 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.385440 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18
fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.399224 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.477113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.477626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.477640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.477663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.477677 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.555055 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:40 crc kubenswrapper[4834]: E1008 22:23:40.555228 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.580980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.581058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.581071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.581093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.581106 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.684224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.684279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.684291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.684309 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.684320 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.749988 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerStarted","Data":"3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.750058 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerStarted","Data":"b8c931c82380f852a7d9911ae403485b5e9346e98a943b0481d127b4a88d75a8"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.751723 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kh4dw" event={"ID":"08f0a28c-1bc4-4302-bc15-b2d975ef0b47","Type":"ContainerStarted","Data":"08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.754446 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ae96bc9-69e0-4f09-b59b-92157a2d5948" containerID="d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553" exitCode=0 Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.754516 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerDied","Data":"d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.754540 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerStarted","Data":"3a75536bc7ba667925efec55503da5c6a7450c21d6abf1ad5aa50907e2beaaa2"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.757205 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" exitCode=0 Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.757252 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.757310 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"2a42fb3fdc38aeee5025bdc655cb04ffcb33ee59bf21601262bc514d75501646"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.759668 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.759723 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.759739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"64b385554ab08a782c948574db83f2e9802eebd84c88479aa3b2668df1f80c99"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.798115 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.798264 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.798299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.798315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.798337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.798352 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.820604 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.849863 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.875087 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.888613 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.901540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.901603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.901618 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.901643 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.901545 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.901664 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:40Z","lastTransitionTime":"2025-10-08T22:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.922235 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.934354 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.948534 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.966509 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.980459 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, 
/tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:40 crc kubenswrapper[4834]: I1008 22:23:40.994094 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:40Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.004535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.004593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.004606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.004628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.004642 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.011725 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.034222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.047751 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.062978 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.074649 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2
775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.096285 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e596
1fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.107157 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.107266 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.107359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.107433 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.107508 4834 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.109203 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.123471 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.141108 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.167298 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.183708 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.193820 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.194117 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:23:49.194088596 +0000 UTC m=+37.016973342 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.201504 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.209632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.209768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.209827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.209916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.209987 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.225339 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.240051 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.261994 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.275743 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.294527 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.294608 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.294637 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294794 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294814 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294828 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 
08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294817 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294885 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:49.294863988 +0000 UTC m=+37.117748754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294934 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294956 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:49.29491806 +0000 UTC m=+37.117802826 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.294981 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:49.294971431 +0000 UTC m=+37.117856197 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.313413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.313474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.313489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.313515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.313532 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.395694 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.395982 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.396037 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.396053 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.396175 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:23:49.396116671 +0000 UTC m=+37.219001587 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.417136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.418064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.418201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.418308 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.418388 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.521931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.521992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.522008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.522029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.522041 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.554526 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.554526 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.555152 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:41 crc kubenswrapper[4834]: E1008 22:23:41.555119 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.624720 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.624753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.624763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.624779 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.624789 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.727632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.727667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.727680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.727699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.727712 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.766995 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.767067 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.767095 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.767122 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.767201 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.767230 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} Oct 08 22:23:41 crc kubenswrapper[4834]: 
I1008 22:23:41.769237 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ae96bc9-69e0-4f09-b59b-92157a2d5948" containerID="9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6" exitCode=0 Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.769347 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerDied","Data":"9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.810391 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10
-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.828657 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.832693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.832733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.832751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.832774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.832787 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.843760 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8
ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.860262 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.877541 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.901892 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.974064 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.990427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.990488 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.990504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.990555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.990569 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:41Z","lastTransitionTime":"2025-10-08T22:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:41 crc kubenswrapper[4834]: I1008 22:23:41.994849 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.008684 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.021175 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.034460 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.053038 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.064646 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.078461 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd57
3d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.094289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.094338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.094347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.094367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.094379 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.197908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.197956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.197969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.197989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.198006 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.301099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.301164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.301174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.301192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.301204 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.403968 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.404022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.404036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.404054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.404065 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.467470 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-wxzpv"] Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.467917 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.469778 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.470709 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.471008 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.471234 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.490085 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.507528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.507580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.507599 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.507625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.507645 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.508497 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-host\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.508593 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-serviceca\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.508657 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5zpz\" (UniqueName: \"kubernetes.io/projected/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-kube-api-access-x5zpz\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.518934 
4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.555291 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:42 crc kubenswrapper[4834]: E1008 22:23:42.555775 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.566558 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.591597 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610337 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-host\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610423 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-serviceca\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610460 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5zpz\" (UniqueName: \"kubernetes.io/projected/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-kube-api-access-x5zpz\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610488 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-host\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610577 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.610663 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.612983 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-serviceca\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.625285 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.638470 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5zpz\" (UniqueName: \"kubernetes.io/projected/604cbcf3-0c65-49d1-b7af-4ac41fba5bab-kube-api-access-x5zpz\") pod \"node-ca-wxzpv\" (UID: \"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\") " pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.638741 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.651750 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.663702 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.678604 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.699105 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.712962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.713027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.713045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc 
kubenswrapper[4834]: I1008 22:23:42.713072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.713091 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.713580 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.727902 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.748217 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.766176 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.776368 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ae96bc9-69e0-4f09-b59b-92157a2d5948" containerID="a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb" exitCode=0 Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.776428 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerDied","Data":"a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.788336 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wxzpv" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.793869 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Comp
leted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.813314 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.816601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.816646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.816656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.816674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.816684 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.833056 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.850259 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.879431 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.900127 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.919595 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.920137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.920220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.920241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.920267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.920285 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:42Z","lastTransitionTime":"2025-10-08T22:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.938639 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.954969 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.972182 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:42 crc kubenswrapper[4834]: I1008 22:23:42.997923 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:42Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.019731 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb4
09d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.024973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.025012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.025022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.025040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.025050 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.035749 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.055565 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.076087 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.089261 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.127680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.127716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.127724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 
22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.127739 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.127751 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.230691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.230769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.230788 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.230811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.230827 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.334696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.334778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.334803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.334836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.334864 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.439603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.439702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.439736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.439773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.439798 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.543274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.543350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.543378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.543406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.543425 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.555547 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:43 crc kubenswrapper[4834]: E1008 22:23:43.555758 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.555844 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:43 crc kubenswrapper[4834]: E1008 22:23:43.556049 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.577239 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.598988 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.621613 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.636506 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.646797 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.646842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.646853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 
22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.646874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.646888 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.654895 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.670616 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.685862 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.707253 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.722909 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.749561 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.750395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.750435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.750447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.750470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.750485 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.771621 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.789837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.791371 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.792286 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wxzpv" event={"ID":"604cbcf3-0c65-49d1-b7af-4ac41fba5bab","Type":"ContainerStarted","Data":"53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.792329 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wxzpv" event={"ID":"604cbcf3-0c65-49d1-b7af-4ac41fba5bab","Type":"ContainerStarted","Data":"07eb8d12328027632758318f91477b89d3340fde92cbe202a080e36f80436f4e"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.797666 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ae96bc9-69e0-4f09-b59b-92157a2d5948" containerID="99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c" exitCode=0 Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.797712 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerDied","Data":"99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.809305 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.844239 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.860733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.860810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.860826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.860852 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.860865 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.869918 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 
22:23:43.894173 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.908839 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.924058 4834 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.943694 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.963543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.963591 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.963608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.963636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.963654 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:43Z","lastTransitionTime":"2025-10-08T22:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.965180 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.980064 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:43 crc kubenswrapper[4834]: I1008 22:23:43.994647 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:43Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.016771 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.030849 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.050489 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.066207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.066265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.066278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.066302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.066317 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.066564 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.081219 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.099401 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.113256 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.130252 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd57
3d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.169907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.169951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.169959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.169975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.169986 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.273550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.273612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.273624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.273644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.273656 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.377111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.377216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.377242 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.377273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.377293 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.480740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.480807 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.480824 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.480853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.480873 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.554886 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:44 crc kubenswrapper[4834]: E1008 22:23:44.555057 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.588837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.588907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.588924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.588954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.588972 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.692665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.692723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.692735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.692759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.692775 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.796219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.796273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.796284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.796303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.796317 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.806544 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ae96bc9-69e0-4f09-b59b-92157a2d5948" containerID="300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa" exitCode=0 Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.806662 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerDied","Data":"300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.838656 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.855985 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.878364 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.900590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.900672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.900692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.900724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.900747 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:44Z","lastTransitionTime":"2025-10-08T22:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.901692 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.916830 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.930203 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.958041 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:44 crc kubenswrapper[4834]: I1008 22:23:44.986740 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb4
09d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:44Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.003090 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.003185 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.003199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.003218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.003281 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.006793 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.026248 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.043808 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.058718 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.077000 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.090414 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.105892 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.106627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.106670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.106682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.106704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.106718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.210478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.210529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.210540 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.210562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.210577 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.314663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.314857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.314869 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.314893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.314905 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.418665 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.418740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.418759 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.418789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.418807 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.522569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.522662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.522688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.522725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.522801 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.555533 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.555699 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:45 crc kubenswrapper[4834]: E1008 22:23:45.555816 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:45 crc kubenswrapper[4834]: E1008 22:23:45.556041 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.626543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.626620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.626641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.626670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.626693 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.730140 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.731232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.731465 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.731632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.731750 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.815707 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ae96bc9-69e0-4f09-b59b-92157a2d5948" containerID="e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4" exitCode=0 Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.815781 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerDied","Data":"e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.834100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.834178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.834193 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.834212 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.834227 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.839620 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.866528 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.883834 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.897579 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.911262 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.921764 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.937390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.937522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.937538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.937558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.937570 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:45Z","lastTransitionTime":"2025-10-08T22:23:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.939904 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.967617 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.980340 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:45 crc kubenswrapper[4834]: I1008 22:23:45.993314 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:45Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.004397 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, 
/tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.017040 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.031576 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.040932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.041230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.041264 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.041282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.041295 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.053909 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.074672 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.153625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.153700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.153724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.153755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.153776 4834 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.257011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.257052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.257064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.257086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.257100 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.368310 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.368376 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.368395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.368429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.368451 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.471521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.471563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.471574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.471590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.471602 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.555229 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:46 crc kubenswrapper[4834]: E1008 22:23:46.555419 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.574440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.574484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.574497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.574521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.574539 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.677790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.677841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.677851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.677871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.677883 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.781630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.781695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.781725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.781760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.781784 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.824658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" event={"ID":"0ae96bc9-69e0-4f09-b59b-92157a2d5948","Type":"ContainerStarted","Data":"78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.830266 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.830818 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.850277 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.860968 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.861908 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.876357 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.884980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.885020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.885031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc 
kubenswrapper[4834]: I1008 22:23:46.885050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.885063 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.905419 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.917774 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.933115 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.954455 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, 
/tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.970524 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.988988 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:46Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.990550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.990640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.990667 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.990700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:46 crc kubenswrapper[4834]: I1008 22:23:46.990725 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:46Z","lastTransitionTime":"2025-10-08T22:23:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.010627 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.035851 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.051722 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.071850 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.090715 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.095020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc 
kubenswrapper[4834]: I1008 22:23:47.095102 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.095128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.095198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.095227 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.117406 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.138110 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.173482 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 
22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.196372 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.198092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.198161 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.198172 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.198192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.198205 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.211544 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.239291 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.258445 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9
ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e049
53128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T
22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.273426 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.287652 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.300849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.300916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.300944 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.300978 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.301004 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.302864 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.315236 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.330049 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.343609 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.359787 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2
775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.391711 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e596
1fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.403866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.403920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.403930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.403953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.403966 4834 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.414094 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.435957 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.508131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.508219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.508238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.508267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.508286 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.555404 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.555505 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:47 crc kubenswrapper[4834]: E1008 22:23:47.555642 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:47 crc kubenswrapper[4834]: E1008 22:23:47.555889 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.611660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.611755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.611778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.611809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.611828 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.714838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.714900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.714923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.714955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.714980 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.819595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.819666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.819687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.819715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.819738 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.834110 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.834706 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.882594 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.921048 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"
ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.923672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.923745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.923766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.923801 4834 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.923821 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:47Z","lastTransitionTime":"2025-10-08T22:23:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.941382 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.964813 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:47 crc kubenswrapper[4834]: I1008 22:23:47.986402 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:47Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.006056 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.027497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.027892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.027910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.027941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.027960 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.033866 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.060327 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9
ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e049
53128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T
22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.084372 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.106991 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.127314 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.130851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.130902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.130918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.130940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.130955 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.151406 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.167747 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.185391 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.204342 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.225623 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:48Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.234475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.234527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.234546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.234573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.234593 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.337621 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.337857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.337878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.337906 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.337927 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.442762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.442841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.442860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.442892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.442911 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.545994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.546046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.546064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.546090 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.546107 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.555223 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:48 crc kubenswrapper[4834]: E1008 22:23:48.555393 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.653520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.653567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.653584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.653609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.653626 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.764959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.765014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.765033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.765060 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.765118 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.839708 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.869435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.869490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.869504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.869529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.869545 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.972554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.972592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.972603 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.972621 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:48 crc kubenswrapper[4834]: I1008 22:23:48.972635 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:48Z","lastTransitionTime":"2025-10-08T22:23:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.075877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.075951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.075968 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.075994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.076011 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.178460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.178515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.178533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.178556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.178570 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.280969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.281034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.281048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.281071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.281087 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.289357 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.289764 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 22:24:05.289744537 +0000 UTC m=+53.112629293 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.384768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.384840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.384861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.384908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.384928 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.390765 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.390821 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.390870 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391066 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391091 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391193 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:05.391114602 +0000 UTC m=+53.213999388 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391225 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391269 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:05.391230074 +0000 UTC m=+53.214114900 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391272 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391327 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.391398 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:05.391378938 +0000 UTC m=+53.214263784 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.488162 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.488221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.488233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.488482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.488499 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.492140 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.492330 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.492360 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.492375 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.492449 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:05.492425866 +0000 UTC m=+53.315310622 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.557242 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.557377 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.557781 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:49 crc kubenswrapper[4834]: E1008 22:23:49.557847 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.591329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.591397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.591429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.591457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.591476 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.695225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.695309 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.695333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.695368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.695392 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.798800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.798846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.798859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.798879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.798892 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.843220 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.902988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.903061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.903080 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.903109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:49 crc kubenswrapper[4834]: I1008 22:23:49.903128 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:49Z","lastTransitionTime":"2025-10-08T22:23:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.006660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.006729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.006745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.006776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.006794 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.091312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.091402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.091444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.091477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.091503 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.113367 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.122747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.122808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.122827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.122854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.122874 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.141083 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.147694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.147736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.147748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.147771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.147784 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.170983 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.177174 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.177258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.177277 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.177309 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.177330 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.193601 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.198732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.198792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.198810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.198841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.198866 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.219028 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.219312 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.222291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.222363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.222384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.222414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.222435 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.325771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.325843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.325903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.325940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.325965 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.429194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.429268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.429288 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.429319 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.429342 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.534075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.534172 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.534195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.534224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.534245 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.554862 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:50 crc kubenswrapper[4834]: E1008 22:23:50.555060 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.639199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.639286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.639306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.639336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.639356 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.742843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.742887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.742898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.742918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.742928 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.846601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.846652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.846668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.846695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.846713 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.851390 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/0.log" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.855679 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7" exitCode=1 Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.855742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.856734 4834 scope.go:117] "RemoveContainer" containerID="bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.882404 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.899366 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.923877 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.950018 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.953097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.953169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.953180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:50 crc 
kubenswrapper[4834]: I1008 22:23:50.953204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.953216 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:50Z","lastTransitionTime":"2025-10-08T22:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.979304 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:50 crc kubenswrapper[4834]: I1008 22:23:50.998128 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:50Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.012777 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.037105 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.062775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.062847 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.062871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 
22:23:51.063111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.063186 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.069974 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.092681 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.135911 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22
:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.163190 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.167555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.167623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.167644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.167673 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.167690 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.181074 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.206299 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.226055 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:51Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.271456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.271501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.271515 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.271543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.271560 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.374993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.375061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.375078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.375106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.375127 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.478597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.478668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.478684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.478714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.478733 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.554624 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.554755 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:51 crc kubenswrapper[4834]: E1008 22:23:51.554827 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:51 crc kubenswrapper[4834]: E1008 22:23:51.554967 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.581508 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.581849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.582001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.582180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.582330 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.685367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.685973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.685997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.686024 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.686043 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.789784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.789860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.789879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.789910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.789929 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.862246 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/0.log" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.865915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746"} Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.892644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.892784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.892815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.892885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:51 crc kubenswrapper[4834]: I1008 22:23:51.892927 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:51Z","lastTransitionTime":"2025-10-08T22:23:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.013856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.013923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.013941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.013967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.013984 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.118125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.118205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.118226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.118254 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.118274 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.221930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.222001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.222020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.222052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.222073 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.324929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.325012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.325033 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.325059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.325079 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.428931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.429003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.429021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.429050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.429067 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.533286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.533356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.533374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.533405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.533429 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.554833 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:52 crc kubenswrapper[4834]: E1008 22:23:52.555186 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.636364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.636421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.636446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.636475 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.636497 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.738892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.738993 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.739009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.739039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.739055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.842861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.842916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.842928 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.842950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.842983 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.869435 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.892549 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:52Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.905131 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx"] Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.906131 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.909732 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.910258 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.916658 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:52Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.937668 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:52Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.946917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:52 crc 
kubenswrapper[4834]: I1008 22:23:52.946969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.946983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.947036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.947054 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:52Z","lastTransitionTime":"2025-10-08T22:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.954764 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:52Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.969430 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:52Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:52 crc kubenswrapper[4834]: I1008 22:23:52.989456 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:52Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.005906 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.027502 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.033405 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7944e142-203a-405b-bd09-88f3512170e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.033515 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7944e142-203a-405b-bd09-88f3512170e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.033578 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/7944e142-203a-405b-bd09-88f3512170e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.033643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9djz\" (UniqueName: \"kubernetes.io/projected/7944e142-203a-405b-bd09-88f3512170e6-kube-api-access-q9djz\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.044443 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c
0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.051574 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.051615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.051631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.051658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.051676 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.113191 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.127921 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.135321 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7944e142-203a-405b-bd09-88f3512170e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.135373 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7944e142-203a-405b-bd09-88f3512170e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.135418 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7944e142-203a-405b-bd09-88f3512170e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.135460 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9djz\" (UniqueName: \"kubernetes.io/projected/7944e142-203a-405b-bd09-88f3512170e6-kube-api-access-q9djz\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.136706 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7944e142-203a-405b-bd09-88f3512170e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.136833 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7944e142-203a-405b-bd09-88f3512170e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.141033 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.148673 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7944e142-203a-405b-bd09-88f3512170e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.151430 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9djz\" (UniqueName: \"kubernetes.io/projected/7944e142-203a-405b-bd09-88f3512170e6-kube-api-access-q9djz\") pod \"ovnkube-control-plane-749d76644c-8svwx\" (UID: \"7944e142-203a-405b-bd09-88f3512170e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.159442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.159480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.159492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.159511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.159524 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.159726 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.174713 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.187589 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b7770
5b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.200584 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.215750 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.232607 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.236124 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.252072 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.262112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.262166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.262176 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.262200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.262211 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.266121 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.277450 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.293251 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.318955 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.333767 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.349675 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.365238 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.365922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.365984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.365999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.366023 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.366075 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.380780 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.395086 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.418346 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.435916 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.456667 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.469442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.469510 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.469531 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.469561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.469582 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.554863 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.554895 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:53 crc kubenswrapper[4834]: E1008 22:23:53.555087 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:53 crc kubenswrapper[4834]: E1008 22:23:53.555164 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.572449 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.573390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.573443 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.573455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.573477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.573490 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.591788 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.605750 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.630414 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.650498 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.663633 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.675637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.675680 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.675691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.675711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.675723 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.681190 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.695075 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.714245 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.728409 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.738885 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.751745 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set 
denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f4
1aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.778283 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.778330 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.778341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.778365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.778377 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.787577 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.818376 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.842125 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.858790 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.874134 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" event={"ID":"7944e142-203a-405b-bd09-88f3512170e6","Type":"ContainerStarted","Data":"bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.874420 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" event={"ID":"7944e142-203a-405b-bd09-88f3512170e6","Type":"ContainerStarted","Data":"54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.874437 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" event={"ID":"7944e142-203a-405b-bd09-88f3512170e6","Type":"ContainerStarted","Data":"44a8ea08ef07042e337463474dadd991b14d13accaa0d8691065140221ab3558"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.876817 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/1.log" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.877394 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/0.log" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.881696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.881736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.881749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.881770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.881790 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.882798 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746" exitCode=1 Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.882853 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.882902 4834 scope.go:117] "RemoveContainer" containerID="bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.883812 4834 scope.go:117] "RemoveContainer" containerID="99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746" Oct 08 22:23:53 crc kubenswrapper[4834]: E1008 22:23:53.884019 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.894272 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.912349 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.929546 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.947100 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.962624 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.975481 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.985112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.985177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.985190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.985211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.985225 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:53Z","lastTransitionTime":"2025-10-08T22:23:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.986667 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:53 crc kubenswrapper[4834]: I1008 22:23:53.999301 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:53Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.011033 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.021755 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-g7fd8"] Oct 08 22:23:54 crc 
kubenswrapper[4834]: I1008 22:23:54.022431 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: E1008 22:23:54.022527 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.023714 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1
b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.037462 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.055926 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.069355 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2
775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.088748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.088804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.088817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.088841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.088855 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.090596 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.113322 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.131672 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.146448 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.150409 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrqgc\" (UniqueName: \"kubernetes.io/projected/e266421d-b52e-42f9-a7db-88f09ba1c075-kube-api-access-nrqgc\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.150486 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.159576 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.172781 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.183035 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc 
kubenswrapper[4834]: I1008 22:23:54.201190 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.214487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.214548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.214560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.214583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.214596 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.215862 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b23
5a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.229727 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.241257 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243
d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.251618 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrqgc\" (UniqueName: \"kubernetes.io/projected/e266421d-b52e-42f9-a7db-88f09ba1c075-kube-api-access-nrqgc\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: 
I1008 22:23:54.251673 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: E1008 22:23:54.251814 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:54 crc kubenswrapper[4834]: E1008 22:23:54.251895 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:54.751873992 +0000 UTC m=+42.574758738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.255615 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.268805 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrqgc\" (UniqueName: \"kubernetes.io/projected/e266421d-b52e-42f9-a7db-88f09ba1c075-kube-api-access-nrqgc\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.271101 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.283803 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.312702 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.045736 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:53.045792 6307 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 22:23:53.045802 6307 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 22:23:53.045861 6307 factory.go:656] Stopping watch factory\\\\nI1008 22:23:53.045905 6307 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 22:23:53.045925 
6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 22:23:53.045941 6307 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 22:23:53.045997 6307 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:53.046061 6307 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046196 6307 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046299 6307 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/netwo
rks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.317670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc 
kubenswrapper[4834]: I1008 22:23:54.317949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.318051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.318195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.318319 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.330248 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.342278 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.357258 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.374221 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.387447 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:23:54Z is after 2025-08-24T17:21:41Z" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.421201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.421233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.421241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.421259 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.421270 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.524334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.524393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.524403 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.524422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.524432 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.555099 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:54 crc kubenswrapper[4834]: E1008 22:23:54.555461 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.628281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.628853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.629084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.629315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.629508 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.732728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.732792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.732810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.732834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.732848 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.759408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:54 crc kubenswrapper[4834]: E1008 22:23:54.759619 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:54 crc kubenswrapper[4834]: E1008 22:23:54.760444 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:55.760410068 +0000 UTC m=+43.583294844 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.836276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.836328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.836338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.836356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.836370 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.888801 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/1.log" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.941987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.942050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.942062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.942082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:54 crc kubenswrapper[4834]: I1008 22:23:54.942094 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:54Z","lastTransitionTime":"2025-10-08T22:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.045043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.045105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.045122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.045177 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.045191 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.147980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.148029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.148039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.148058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.148069 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.251129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.251405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.251419 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.251438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.251453 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.354005 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.354037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.354047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.354063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.354095 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.456845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.456908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.456920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.456937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.456948 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.554524 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.554698 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:55 crc kubenswrapper[4834]: E1008 22:23:55.554792 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.554887 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:55 crc kubenswrapper[4834]: E1008 22:23:55.555121 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:23:55 crc kubenswrapper[4834]: E1008 22:23:55.555246 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.559311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.559337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.559345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.559360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.559369 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.665031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.665683 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.665698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.665724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.665739 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.769224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.769262 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.769272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.769290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.769302 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.770108 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:55 crc kubenswrapper[4834]: E1008 22:23:55.770279 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:55 crc kubenswrapper[4834]: E1008 22:23:55.770343 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:23:57.770317889 +0000 UTC m=+45.593202845 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.872645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.872705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.872717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.872740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.872753 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.975939 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.975986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.975996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.976012 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:55 crc kubenswrapper[4834]: I1008 22:23:55.976023 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:55Z","lastTransitionTime":"2025-10-08T22:23:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.079115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.079206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.079222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.079241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.079253 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.182402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.182467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.182490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.182519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.182539 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.285588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.285641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.285654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.285672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.285682 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.388570 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.388650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.388670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.388703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.388728 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.492784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.492878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.492897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.492964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.493198 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.555274 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:56 crc kubenswrapper[4834]: E1008 22:23:56.555433 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.596594 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.596652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.596664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.596687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.596701 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.699762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.699803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.699811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.699828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.699839 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.802346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.802428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.802456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.802487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.802506 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.904860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.904917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.904927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.904951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:56 crc kubenswrapper[4834]: I1008 22:23:56.904962 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:56Z","lastTransitionTime":"2025-10-08T22:23:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.008007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.008243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.008256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.008277 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.008287 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.111639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.111686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.111696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.111711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.111721 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.214241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.214301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.214313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.214333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.214347 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.317466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.317497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.317510 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.317527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.317539 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.420583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.420633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.420643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.420663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.420679 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.523669 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.523725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.523740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.523763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.523778 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.555478 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.555568 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.555766 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:57 crc kubenswrapper[4834]: E1008 22:23:57.556307 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:23:57 crc kubenswrapper[4834]: E1008 22:23:57.556585 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:57 crc kubenswrapper[4834]: E1008 22:23:57.556858 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.630763 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.630833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.630850 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.630879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.630896 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.733735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.733794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.733812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.733836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.733856 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.797836 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:57 crc kubenswrapper[4834]: E1008 22:23:57.798090 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:57 crc kubenswrapper[4834]: E1008 22:23:57.798255 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:01.798216413 +0000 UTC m=+49.621101199 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.837667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.837737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.837761 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.838382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.838601 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.942381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.942463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.942480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.942546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:57 crc kubenswrapper[4834]: I1008 22:23:57.942564 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:57Z","lastTransitionTime":"2025-10-08T22:23:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.046608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.046668 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.046685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.046711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.046728 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.149666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.149724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.149737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.149760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.149775 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.252473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.252502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.252513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.252529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.252541 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.357201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.357258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.357272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.357292 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.357304 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.460504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.460942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.461072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.461276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.461408 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.554891 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:23:58 crc kubenswrapper[4834]: E1008 22:23:58.555079 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.564671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.564719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.564737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.564760 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.564776 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.667950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.668019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.668032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.668053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.668067 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.771462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.771618 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.771644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.771676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.771699 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.874984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.875032 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.875043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.875063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.875078 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.978282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.978361 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.978383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.978416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:58 crc kubenswrapper[4834]: I1008 22:23:58.978439 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:58Z","lastTransitionTime":"2025-10-08T22:23:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.081052 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.081096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.081109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.081132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.081166 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.184363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.184411 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.184425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.184445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.184458 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.287956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.288017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.288036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.288063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.288084 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.392126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.392201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.392216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.392236 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.392250 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.496070 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.496256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.496299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.496337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.496360 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.556554 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.556779 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:23:59 crc kubenswrapper[4834]: E1008 22:23:59.557062 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:23:59 crc kubenswrapper[4834]: E1008 22:23:59.557292 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.558086 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:23:59 crc kubenswrapper[4834]: E1008 22:23:59.558354 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.599418 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.599471 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.599487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.599512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.599530 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.702595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.702651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.702672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.702699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.702718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.807644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.807694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.807705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.807725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.807738 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.911962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.912088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.912142 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.912245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:23:59 crc kubenswrapper[4834]: I1008 22:23:59.912270 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:23:59Z","lastTransitionTime":"2025-10-08T22:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.016958 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.017036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.017054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.017602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.017720 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.121766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.121842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.121862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.121892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.121911 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.225709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.225793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.225816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.225849 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.225931 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.329344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.329413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.329438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.329474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.329496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.342256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.342312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.342323 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.342346 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.342357 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.366544 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:00Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.370920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.370974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.370990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.371008 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.371021 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.391727 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:00Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.396560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.396639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.396664 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.396700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.396724 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.418746 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:00Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.423773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.423809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.423819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.423842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.423856 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.443209 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:00Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.447997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.448065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.448086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.448117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.448137 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.472585 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:00Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.472728 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.475120 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.475163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.475173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.475192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.475206 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.554847 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:00 crc kubenswrapper[4834]: E1008 22:24:00.554982 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.578592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.578660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.578678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.578704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.578723 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.682997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.683069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.683088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.683119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.683177 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.786280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.786369 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.786390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.786418 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.786441 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.889986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.890054 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.890071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.890103 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.890122 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.993653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.993729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.993742 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.993764 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:00 crc kubenswrapper[4834]: I1008 22:24:00.993777 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:00Z","lastTransitionTime":"2025-10-08T22:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.096299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.096358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.096375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.096399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.096412 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.200612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.200672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.200695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.200728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.200753 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.304942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.305017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.305034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.305063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.305081 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.409432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.409513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.409533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.409562 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.409580 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.513615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.513675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.513692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.513718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.513778 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.554601 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.554816 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:01 crc kubenswrapper[4834]: E1008 22:24:01.555018 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.555051 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:01 crc kubenswrapper[4834]: E1008 22:24:01.555256 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:01 crc kubenswrapper[4834]: E1008 22:24:01.555633 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.616907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.616966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.616987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.617016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.617036 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.721663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.721743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.721762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.721792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.721812 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.824592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.824641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.824661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.824690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.824709 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.843564 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:01 crc kubenswrapper[4834]: E1008 22:24:01.843829 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:24:01 crc kubenswrapper[4834]: E1008 22:24:01.844286 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:09.844249602 +0000 UTC m=+57.667134388 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.927625 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.927987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.928233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.928493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:01 crc kubenswrapper[4834]: I1008 22:24:01.928839 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:01Z","lastTransitionTime":"2025-10-08T22:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.034265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.034331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.034349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.034378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.034399 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.137910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.137991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.138004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.138025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.138037 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.242481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.242811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.242880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.242985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.243043 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.346529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.346889 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.347385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.347714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.347919 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.452098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.452134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.452168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.452231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.452247 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.554541 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:02 crc kubenswrapper[4834]: E1008 22:24:02.555009 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.556105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.556202 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.556222 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.556253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.556274 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.658423 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.658499 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.658518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.658550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.658579 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.763548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.763632 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.763655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.763703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.763732 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.867135 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.867200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.867212 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.867234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.867247 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.881693 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.893713 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.910842 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf7640
8921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:02Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.931933 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:02Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.952880 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:02Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.968877 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:02Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.969838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.969918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.969946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.969982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.970005 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:02Z","lastTransitionTime":"2025-10-08T22:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:02 crc kubenswrapper[4834]: I1008 22:24:02.990354 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:02Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.007003 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.042741 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.045736 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:53.045792 6307 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 22:23:53.045802 6307 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 22:23:53.045861 6307 factory.go:656] Stopping watch factory\\\\nI1008 22:23:53.045905 6307 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 22:23:53.045925 
6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 22:23:53.045941 6307 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 22:23:53.045997 6307 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:53.046061 6307 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046196 6307 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046299 6307 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/netwo
rks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.061798 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa
572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:
41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"
cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.073073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.073130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.073179 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.073209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.073226 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.080686 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.097844 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.115572 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.134368 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.151613 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.168214 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.175773 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.175865 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.175893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.175933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.175965 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.180854 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.199832 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.214697 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc 
kubenswrapper[4834]: I1008 22:24:03.279432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.279486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.279500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.279525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.279538 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.383848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.383924 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.383950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.383984 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.384009 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.487605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.487688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.487707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.487736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.487756 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.555323 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.555323 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:03 crc kubenswrapper[4834]: E1008 22:24:03.555557 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:03 crc kubenswrapper[4834]: E1008 22:24:03.555702 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.555493 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:03 crc kubenswrapper[4834]: E1008 22:24:03.555958 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.577375 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.590643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.590711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.590726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.590749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.590765 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.593670 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.632262 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb54ed63880b900749ea35521f27016597e4d443ffcb563ce857d49a782e0db7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:49Z\\\",\\\"message\\\":\\\"ent/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.641980 6145 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642006 6145 reflector.go:311] Stopping reflector *v1.EgressIP 
(0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:49.642061 6145 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.642176 6145 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1008 22:23:49.642069 6145 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:49.643120 6145 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1008 22:23:49.643234 6145 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:49.643286 6145 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1008 22:23:49.643299 6145 factory.go:656] Stopping watch factory\\\\nI1008 22:23:49.643321 6145 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.045736 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:53.045792 6307 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 22:23:53.045802 6307 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 22:23:53.045861 6307 factory.go:656] Stopping watch factory\\\\nI1008 22:23:53.045905 6307 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 22:23:53.045925 
6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 22:23:53.045941 6307 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 22:23:53.045997 6307 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:53.046061 6307 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046196 6307 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046299 6307 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/netwo
rks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.657728 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa
572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:
41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"
cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.679362 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.694416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.694473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.694492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.694573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.694594 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.710944 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.727405 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.746858 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.765274 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.780163 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.795819 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.798686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.798746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.798765 4834 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.798796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.798814 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.810704 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.827184 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.842659 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc 
kubenswrapper[4834]: I1008 22:24:03.856183 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.876052 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54
cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.891854 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.901609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.901651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.901662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.901681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.901694 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:03Z","lastTransitionTime":"2025-10-08T22:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:03 crc kubenswrapper[4834]: I1008 22:24:03.906680 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8
ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:03Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.004470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.004518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.004528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.004545 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.004560 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.108870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.108956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.108974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.109002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.109022 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.212338 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.212398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.212416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.212444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.212462 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.316402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.316470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.316490 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.316519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.316540 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.329849 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.331699 4834 scope.go:117] "RemoveContainer" containerID="99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.356189 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"moun
tPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.379291 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.398723 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.419067 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc 
kubenswrapper[4834]: I1008 22:24:04.420835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.420907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.420929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.420959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.420979 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.450727 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.466914 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.487492 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.524757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.524940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.525010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.525079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.525174 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.525961 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.555513 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:04 crc kubenswrapper[4834]: E1008 22:24:04.556137 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.558131 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control
-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.584271 4834 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff
5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.603543 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.618733 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.628530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.628558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.628568 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.628587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.628600 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.640222 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.045736 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:53.045792 6307 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 22:23:53.045802 6307 
handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1008 22:23:53.045861 6307 factory.go:656] Stopping watch factory\\\\nI1008 22:23:53.045905 6307 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 22:23:53.045925 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 22:23:53.045941 6307 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 22:23:53.045997 6307 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:53.046061 6307 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046196 6307 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046299 6307 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.651004 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.664646 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.677516 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.699209 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.716424 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.731645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:04 crc 
kubenswrapper[4834]: I1008 22:24:04.731727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.731748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.731786 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.731807 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.834597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.834636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.834646 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.834663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.834677 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.938297 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.938342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.938353 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.938371 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.938382 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:04Z","lastTransitionTime":"2025-10-08T22:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.943105 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/1.log" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.946650 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c"} Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.947403 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.964735 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a
668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:04 crc kubenswrapper[4834]: I1008 22:24:04.984629 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127
372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:04Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.004027 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.024101 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.040872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.040938 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.040955 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 
22:24:05.040976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.040992 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.043132 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.059872 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc 
kubenswrapper[4834]: I1008 22:24:05.080280 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.101608 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.124975 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.145095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.145158 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.145172 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc 
kubenswrapper[4834]: I1008 22:24:05.145190 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.145204 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.157495 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 
22:24:05.198018 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.214957 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.241586 4834 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.045736 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:53.045792 6307 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 22:23:53.045802 6307 handler.go:190] Sending 
*v1.Pod event handler 6 for removal\\\\nI1008 22:23:53.045861 6307 factory.go:656] Stopping watch factory\\\\nI1008 22:23:53.045905 6307 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 22:23:53.045925 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 22:23:53.045941 6307 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 22:23:53.045997 6307 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:53.046061 6307 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046196 6307 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046299 6307 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.254039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.254078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.254087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.254103 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.254113 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.269935 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.285843 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.302879 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c
6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.317534 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.337644 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.356879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.356956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.356973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.357432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.357490 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.388808 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.389039 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:24:37.389004177 +0000 UTC m=+85.211888923 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.461375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.461427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.461438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.461459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.461472 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.490542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.490607 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.490662 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.490723 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.490832 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.490866 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:37.490832793 +0000 UTC m=+85.313717569 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.490911 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:37.490885135 +0000 UTC m=+85.313769911 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.491072 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.491138 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.491207 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.491313 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:37.491278083 +0000 UTC m=+85.314162969 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.555614 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.555764 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.555891 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.556041 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.555642 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.556260 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.564705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.564778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.564798 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.564832 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.564851 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.591761 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.592117 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.592207 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.592226 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.592320 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:37.592291861 +0000 UTC m=+85.415176617 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.668378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.668581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.668600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.668628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.668649 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.772406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.772458 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.772467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.772487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.772497 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.876163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.876228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.876247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.876275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.876293 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.955024 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/2.log" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.955875 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/1.log" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.961286 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c" exitCode=1 Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.961372 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.961463 4834 scope.go:117] "RemoveContainer" containerID="99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.963368 4834 scope.go:117] "RemoveContainer" containerID="deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c" Oct 08 22:24:05 crc kubenswrapper[4834]: E1008 22:24:05.963723 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.979402 4834 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.979439 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.979451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.979472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.979488 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:05Z","lastTransitionTime":"2025-10-08T22:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:05 crc kubenswrapper[4834]: I1008 22:24:05.990047 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.013031 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.033564 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.056911 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.077701 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.084803 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.084853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.084862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.084881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.084892 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.097600 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.121949 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f23210ecaa7cf7f90c624a58456fbf8387fc84df3780540b338e73434f2746\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"message\\\":\\\"hift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.045736 6307 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1008 22:23:53.045792 6307 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1008 22:23:53.045802 6307 handler.go:190] Sending 
*v1.Pod event handler 6 for removal\\\\nI1008 22:23:53.045861 6307 factory.go:656] Stopping watch factory\\\\nI1008 22:23:53.045905 6307 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1008 22:23:53.045925 6307 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1008 22:23:53.045941 6307 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1008 22:23:53.045997 6307 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1008 22:23:53.046061 6307 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046196 6307 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1008 22:23:53.046299 6307 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node 
network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f
w97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.144910 4834 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.161614 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.178567 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.190472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.190559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.190583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.190616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.190635 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.200466 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.216726 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.231635 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.244161 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.258271 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.269393 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.280921 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.293558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.293616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.293633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.293656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.293672 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.295481 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc 
kubenswrapper[4834]: I1008 22:24:06.395848 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.395897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.395908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.395927 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.395941 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.499498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.499541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.499557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.499576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.499588 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.554855 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:06 crc kubenswrapper[4834]: E1008 22:24:06.555066 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.602244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.602312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.602328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.602348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.602362 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.705996 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.706065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.706085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.706118 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.706163 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.808339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.808422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.808440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.808467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.808486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.911074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.911112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.911122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.911137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.911162 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:06Z","lastTransitionTime":"2025-10-08T22:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.966846 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/2.log" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.972100 4834 scope.go:117] "RemoveContainer" containerID="deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c" Oct 08 22:24:06 crc kubenswrapper[4834]: E1008 22:24:06.972344 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.987959 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:06 crc kubenswrapper[4834]: I1008 22:24:06.998703 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:06Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.012194 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2
775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.014004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.014034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.014045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.014064 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.014078 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.024291 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc 
kubenswrapper[4834]: I1008 22:24:07.049581 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.063133 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.077807 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.093335 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243
d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.114138 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.117529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.117556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.117565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.117581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.117591 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.139029 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.157721 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.190903 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network 
policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.210369 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.221135 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.221235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.221251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.221275 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.221297 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.231888 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b
8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.251670 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.268542 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.290670 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.305348 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:07Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.324068 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.324124 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.324173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.324200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.324219 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.426455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.426515 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.426533 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.426558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.426577 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.530169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.530227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.530245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.530268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.530283 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.554871 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.554916 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.554973 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:07 crc kubenswrapper[4834]: E1008 22:24:07.555068 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:07 crc kubenswrapper[4834]: E1008 22:24:07.555208 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:07 crc kubenswrapper[4834]: E1008 22:24:07.555376 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.634078 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.634192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.634211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.634241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.634261 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.737102 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.737210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.737235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.737269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.737293 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.840596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.840676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.840699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.840732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.840755 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.944696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.944771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.944790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.944821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:07 crc kubenswrapper[4834]: I1008 22:24:07.944840 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:07Z","lastTransitionTime":"2025-10-08T22:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.047913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.047976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.047994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.048017 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.048036 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.151662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.151717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.151732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.151757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.151772 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.256050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.256130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.256191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.256218 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.256235 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.359427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.359528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.359561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.359596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.359620 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.463544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.463623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.463650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.463677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.463694 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.555362 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:08 crc kubenswrapper[4834]: E1008 22:24:08.555597 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.568611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.568675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.568736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.568765 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.568818 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.672219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.672282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.672303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.672333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.672354 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.776267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.776344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.776365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.776397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.776417 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.880492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.880587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.880606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.880642 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.880664 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.983538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.983595 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.983614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.983644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:08 crc kubenswrapper[4834]: I1008 22:24:08.983663 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:08Z","lastTransitionTime":"2025-10-08T22:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.087428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.087492 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.087506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.087532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.087546 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.190329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.190373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.190406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.190425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.190434 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.293973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.294049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.294069 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.294097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.294118 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.397709 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.397771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.397787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.397812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.397827 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.500863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.500931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.500952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.500981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.501002 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.555088 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:09 crc kubenswrapper[4834]: E1008 22:24:09.555264 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.555471 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:09 crc kubenswrapper[4834]: E1008 22:24:09.555521 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.555709 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:09 crc kubenswrapper[4834]: E1008 22:24:09.555772 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.604845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.604892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.604901 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.604922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.604933 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.708874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.708948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.708966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.708994 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.709013 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.812360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.812412 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.812425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.812447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.812465 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.916480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.916552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.916572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.916605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.916625 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:09Z","lastTransitionTime":"2025-10-08T22:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:09 crc kubenswrapper[4834]: I1008 22:24:09.944492 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8"
Oct 08 22:24:09 crc kubenswrapper[4834]: E1008 22:24:09.944738 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 08 22:24:09 crc kubenswrapper[4834]: E1008 22:24:09.944880 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:25.944844266 +0000 UTC m=+73.767729052 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.019932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.019989 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.020010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.020037 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.020055 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.124166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.124213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.124224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.124245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.124260 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.226974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.227028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.227041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.227093 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.227111 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.331138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.331231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.331245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.331268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.331287 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.434565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.434626 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.434644 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.434679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.434703 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.537573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.537662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.537690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.537729 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.537756 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.555531 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.555793 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.642336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.642421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.642446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.642482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.642510 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.745409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.745462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.745473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.745493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.745508 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.746826 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.746885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.746911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.746943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.746966 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.770007 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:10Z is after 2025-08-24T17:21:41Z"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.775892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.775935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.775951 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.775974 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.775990 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.796890 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:10Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.803244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.803293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.803315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.803350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.803375 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.820959 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:10Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.832239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.832306 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.832329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.832357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.832377 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.851650 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:10Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.856489 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.856536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.856558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.856613 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.856630 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.873588 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:10Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:10 crc kubenswrapper[4834]: E1008 22:24:10.873738 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.875460 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.875503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.875513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.875555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.875574 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.979398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.979445 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.979457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.979477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:10 crc kubenswrapper[4834]: I1008 22:24:10.979490 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:10Z","lastTransitionTime":"2025-10-08T22:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.082324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.082399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.082417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.082448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.082470 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.186096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.186207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.186233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.186269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.186295 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.289324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.289363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.289381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.289406 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.289425 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.392062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.392128 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.392192 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.392231 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.392254 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.495998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.496073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.496099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.496134 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.496198 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.555555 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.555629 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.555643 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:11 crc kubenswrapper[4834]: E1008 22:24:11.555734 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:11 crc kubenswrapper[4834]: E1008 22:24:11.555838 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:11 crc kubenswrapper[4834]: E1008 22:24:11.555902 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.598334 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.598386 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.598399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.598420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.598433 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.701404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.701545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.701565 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.701586 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.701599 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.804985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.805040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.805057 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.805088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.805107 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.908113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.908163 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.908172 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.908188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:11 crc kubenswrapper[4834]: I1008 22:24:11.908199 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:11Z","lastTransitionTime":"2025-10-08T22:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.010985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.011045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.011062 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.011088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.011106 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.114014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.114066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.114084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.114111 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.114128 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.217115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.217488 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.217590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.217711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.217808 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.321195 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.321233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.321244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.321263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.321278 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.424467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.424513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.424525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.424547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.424562 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.527449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.527504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.527520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.527545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.527562 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.554751 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:12 crc kubenswrapper[4834]: E1008 22:24:12.554965 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.629919 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.629956 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.629966 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.629980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.629989 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.732377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.732453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.732480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.732537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.732565 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.836233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.836288 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.836303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.836329 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.836346 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.939859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.940390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.940559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.940711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:12 crc kubenswrapper[4834]: I1008 22:24:12.940854 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:12Z","lastTransitionTime":"2025-10-08T22:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.044553 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.044823 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.044942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.045047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.045135 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.149293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.149370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.149389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.149444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.149462 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.253877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.253953 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.253976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.254005 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.254026 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.357550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.357604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.357615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.357635 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.357647 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.460704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.460752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.460789 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.460814 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.460827 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.555508 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.555549 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.555559 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:13 crc kubenswrapper[4834]: E1008 22:24:13.555835 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:13 crc kubenswrapper[4834]: E1008 22:24:13.555959 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:13 crc kubenswrapper[4834]: E1008 22:24:13.556058 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.567950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.568043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.568067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.568104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.568129 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.572960 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.585885 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.600123 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.618288 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc 
kubenswrapper[4834]: I1008 22:24:13.652028 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.671005 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.671047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.671063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.671120 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.671226 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.673834 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b23
5a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.696000 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.715419 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\"
,\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.739318 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.758286 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.775659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.775718 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.775734 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.775768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.775788 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.783371 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during 
admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.803099 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.820678 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.839752 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148
009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.856438 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.874347 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.878800 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.878861 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.878882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.878908 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.878926 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.895952 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.915201 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:13Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.983650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.983685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.983694 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.983725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:13 crc kubenswrapper[4834]: I1008 22:24:13.983735 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:13Z","lastTransitionTime":"2025-10-08T22:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.086544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.086598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.086609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.086627 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.086638 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.190029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.190575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.190606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.190656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.190686 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.294387 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.294421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.294429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.294444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.294456 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.396995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.397035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.397048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.397067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.397082 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.499818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.499859 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.499874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.499897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.499915 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.555109 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:14 crc kubenswrapper[4834]: E1008 22:24:14.555295 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.601843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.601899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.601917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.601938 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.601952 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.705285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.705335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.705347 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.705366 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.705379 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.808092 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.808168 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.808184 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.808209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.808222 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.911704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.911771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.911790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.911815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:14 crc kubenswrapper[4834]: I1008 22:24:14.911834 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:14Z","lastTransitionTime":"2025-10-08T22:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.014252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.014289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.014299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.014314 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.014324 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.119543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.119610 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.119636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.119658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.119674 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.223061 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.223122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.223136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.223175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.223188 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.326685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.326738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.326750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.326771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.326785 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.430206 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.430243 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.430255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.430274 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.430286 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.533777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.533827 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.533838 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.533858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.533873 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.555069 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.555206 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:15 crc kubenswrapper[4834]: E1008 22:24:15.555244 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.555451 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:15 crc kubenswrapper[4834]: E1008 22:24:15.555552 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:15 crc kubenswrapper[4834]: E1008 22:24:15.555439 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.636327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.636383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.636397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.636417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.636432 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.740485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.740557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.740577 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.740605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.740623 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.844738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.844817 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.844841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.844877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.844900 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.947835 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.947917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.947941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.947969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:15 crc kubenswrapper[4834]: I1008 22:24:15.947989 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:15Z","lastTransitionTime":"2025-10-08T22:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.051409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.051462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.051472 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.051497 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.051509 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.155602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.155663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.155681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.155704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.155717 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.259412 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.259476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.259487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.259508 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.259537 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.363096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.363170 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.363188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.363209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.363223 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.466667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.466767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.466787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.466863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.466885 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.555518 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:16 crc kubenswrapper[4834]: E1008 22:24:16.555885 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.569006 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.569053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.569066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.569085 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.569095 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.672938 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.672982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.672991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.673010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.673022 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.776464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.776501 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.776517 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.776539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.776556 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.879836 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.879887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.879898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.879918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.879929 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.982396 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.982436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.982455 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.982474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:16 crc kubenswrapper[4834]: I1008 22:24:16.982486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:16Z","lastTransitionTime":"2025-10-08T22:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.085482 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.085542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.085552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.085604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.085620 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.188442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.188496 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.188514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.188539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.188559 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.291808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.291869 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.291881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.291895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.291911 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.394607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.394661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.394676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.394698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.394713 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.498133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.498201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.498213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.498236 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.498250 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.555063 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:17 crc kubenswrapper[4834]: E1008 22:24:17.555317 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.555668 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:17 crc kubenswrapper[4834]: E1008 22:24:17.555811 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.556501 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:17 crc kubenswrapper[4834]: E1008 22:24:17.556919 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.600871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.600946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.600967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.601001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.601019 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.703946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.704027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.704041 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.704065 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.704080 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.807502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.807555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.807564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.807583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.807597 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.910616 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.910679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.910693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.910716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:17 crc kubenswrapper[4834]: I1008 22:24:17.910728 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:17Z","lastTransitionTime":"2025-10-08T22:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.014086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.014207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.014234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.014272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.014297 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.117197 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.117287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.117315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.117344 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.117363 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.220576 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.220645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.220659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.220685 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.220702 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.324354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.324425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.324442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.324687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.324801 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.428512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.428561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.428571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.428589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.428600 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.532530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.532599 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.532611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.532629 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.532641 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.555215 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:18 crc kubenswrapper[4834]: E1008 22:24:18.555378 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.635426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.635468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.635476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.635494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.635505 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.738285 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.738583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.738674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.738741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.738796 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.842758 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.842810 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.842822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.842845 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.842859 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.945985 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.946524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.946751 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.946914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:18 crc kubenswrapper[4834]: I1008 22:24:18.947049 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:18Z","lastTransitionTime":"2025-10-08T22:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.050661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.050725 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.050746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.050770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.050791 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.154744 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.154834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.154860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.154893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.154914 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.257617 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.257674 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.257690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.257713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.257728 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.361194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.361239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.361251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.361268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.361278 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.464002 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.464063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.464082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.464103 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.464117 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.555074 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.555123 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.555193 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:19 crc kubenswrapper[4834]: E1008 22:24:19.555265 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:19 crc kubenswrapper[4834]: E1008 22:24:19.555315 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:19 crc kubenswrapper[4834]: E1008 22:24:19.555419 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.566854 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.566890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.566917 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.566931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.566941 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.669630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.669882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.669916 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.669937 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.669956 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.774035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.774100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.774132 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.774173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.774187 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.877251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.877301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.877312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.877333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.877346 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.979291 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.979332 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.979341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.979358 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:19 crc kubenswrapper[4834]: I1008 22:24:19.979369 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:19Z","lastTransitionTime":"2025-10-08T22:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.081737 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.081774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.081790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.081811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.081824 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.183785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.183863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.183886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.183921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.183942 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.285867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.285895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.285904 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.285920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.285931 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.388075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.388113 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.388123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.388137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.388163 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.490753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.490808 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.490821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.490843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.490858 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.554994 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:20 crc kubenswrapper[4834]: E1008 22:24:20.555082 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.567206 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.592897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.592950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.592970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.592990 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.593003 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.696194 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.696236 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.696246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.696268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.696281 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.798935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.799007 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.799022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.799048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.799069 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.902186 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.902233 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.902245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.902265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.902275 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.975873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.975954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.975981 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.976014 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.976071 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:20 crc kubenswrapper[4834]: E1008 22:24:20.992571 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:20Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.996648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.996697 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.996710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.996732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:20 crc kubenswrapper[4834]: I1008 22:24:20.996747 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:20Z","lastTransitionTime":"2025-10-08T22:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: E1008 22:24:21.013920 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:21Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.017618 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.017710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.017736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.017771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.017798 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: E1008 22:24:21.033537 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:21Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.038351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.038392 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.038405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.038430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.038445 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:21Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:21 crc kubenswrapper[4834]: E1008 22:24:21.075648 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.078502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.078531 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.078541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.078556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.078565 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.180922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.180973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.180991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.181019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.181042 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.284464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.284522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.284532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.284550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.284562 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.387982 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.388021 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.388029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.388046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.388056 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.490818 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.490868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.490880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.490900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.490915 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.555129 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.555315 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.555202 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:21 crc kubenswrapper[4834]: E1008 22:24:21.555697 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:21 crc kubenswrapper[4834]: E1008 22:24:21.555830 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:21 crc kubenswrapper[4834]: E1008 22:24:21.556104 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.593575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.593621 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.593633 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.593654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.593668 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.696173 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.696216 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.696224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.696242 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.696253 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.798601 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.798641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.798650 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.798667 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.798677 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.902205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.902244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.902253 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.902271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:21 crc kubenswrapper[4834]: I1008 22:24:21.902285 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:21Z","lastTransitionTime":"2025-10-08T22:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.005071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.005133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.005166 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.005188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.005202 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.107538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.107606 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.107624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.107652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.107672 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.210337 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.210382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.210394 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.210417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.210434 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.312833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.312874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.312883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.312900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.312912 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.415534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.415598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.415615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.415648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.415687 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.518714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.518780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.518795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.518820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.518835 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.554752 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:22 crc kubenswrapper[4834]: E1008 22:24:22.554980 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.555969 4834 scope.go:117] "RemoveContainer" containerID="deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c" Oct 08 22:24:22 crc kubenswrapper[4834]: E1008 22:24:22.556243 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.621463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.621505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.621514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.621529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.621539 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.724307 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.724339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.724348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.724363 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.724375 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.827036 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.827100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.827112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.827137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.827186 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.931330 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.931405 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.931417 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.931440 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:22 crc kubenswrapper[4834]: I1008 22:24:22.931458 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:22Z","lastTransitionTime":"2025-10-08T22:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.034035 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.034191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.034215 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.034245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.034265 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.137130 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.137198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.137209 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.137227 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.137244 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.239476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.239519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.239529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.239555 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.239568 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.341968 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.342019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.342028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.342045 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.342056 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.445389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.445441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.445450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.445468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.445482 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.548219 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.548293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.548302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.548316 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.548326 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.555483 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:23 crc kubenswrapper[4834]: E1008 22:24:23.555725 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.556181 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.556257 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:23 crc kubenswrapper[4834]: E1008 22:24:23.557634 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:23 crc kubenswrapper[4834]: E1008 22:24:23.557635 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.567971 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.593559 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.610258 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.632520 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.648970 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243
d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.650893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.650968 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.650981 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.650997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.651359 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.669414 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 
requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.707099 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.730729 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.754169 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.754211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.754247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.754271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.754286 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.768968 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network 
policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.789065 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.802133 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b
43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.814411 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.835597 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.847822 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.857422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc 
kubenswrapper[4834]: I1008 22:24:23.857469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.857480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.857498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.857525 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.861449 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.876974 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T2
2:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.889065 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.901073 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.912647 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:23Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:23 crc 
kubenswrapper[4834]: I1008 22:24:23.960375 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.960421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.960431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.960446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:23 crc kubenswrapper[4834]: I1008 22:24:23.960455 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:23Z","lastTransitionTime":"2025-10-08T22:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.063832 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.063873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.063883 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.063900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.063913 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.167096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.167164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.167178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.167199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.167215 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.269372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.269404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.269413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.269428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.269438 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.372434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.372485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.372503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.372535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.372555 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.475383 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.475425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.475434 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.475451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.475462 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.555098 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:24 crc kubenswrapper[4834]: E1008 22:24:24.555384 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.578747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.578825 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.578843 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.578879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.578897 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.681468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.681523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.681536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.681564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.681578 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.784479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.784584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.784605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.784649 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.784680 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.887600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.887669 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.887689 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.887719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.887738 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.991526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.991631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.991651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.991684 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:24 crc kubenswrapper[4834]: I1008 22:24:24.991704 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:24Z","lastTransitionTime":"2025-10-08T22:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.094873 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.094922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.094933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.094950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.094961 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.198461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.198509 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.198520 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.198539 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.198550 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.301795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.301851 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.301862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.301880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.301892 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.405279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.405348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.405362 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.405389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.405405 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.508638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.508695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.508713 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.508735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.508751 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.554614 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:25 crc kubenswrapper[4834]: E1008 22:24:25.554809 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.555195 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.555230 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:25 crc kubenswrapper[4834]: E1008 22:24:25.555474 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:25 crc kubenswrapper[4834]: E1008 22:24:25.555621 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.612568 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.612623 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.612636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.612663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.612675 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.716279 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.716345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.716364 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.716395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.716413 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.819258 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.819336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.819354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.819381 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.819399 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.922703 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.922782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.922806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.922839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:25 crc kubenswrapper[4834]: I1008 22:24:25.922860 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:25Z","lastTransitionTime":"2025-10-08T22:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.026598 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.026659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.026677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.026707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.026731 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.039340 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:26 crc kubenswrapper[4834]: E1008 22:24:26.039665 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:24:26 crc kubenswrapper[4834]: E1008 22:24:26.039806 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:24:58.039772495 +0000 UTC m=+105.862657281 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.130712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.130801 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.130812 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.130833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.130846 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.233902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.233957 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.233970 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.233987 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.234000 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.337241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.337331 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.337357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.337398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.337423 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.440570 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.440651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.440672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.440704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.440729 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.544072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.544199 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.544228 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.544263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.544283 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.554574 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:26 crc kubenswrapper[4834]: E1008 22:24:26.554794 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.648379 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.648435 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.648448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.648468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.648484 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.751368 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.751425 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.751433 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.751449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.751460 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.854479 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.854521 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.854529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.854542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.854552 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.957832 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.957903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.957922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.957954 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:26 crc kubenswrapper[4834]: I1008 22:24:26.957971 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:26Z","lastTransitionTime":"2025-10-08T22:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.059975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.060028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.060043 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.060063 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.060077 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.163467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.163507 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.163516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.163535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.163546 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.267027 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.267127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.267144 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.267180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.267204 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.370503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.370561 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.370573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.370592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.370606 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.473495 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.473552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.473575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.473609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.473630 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.560712 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:27 crc kubenswrapper[4834]: E1008 22:24:27.560965 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.561395 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:27 crc kubenswrapper[4834]: E1008 22:24:27.561578 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.562402 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:27 crc kubenswrapper[4834]: E1008 22:24:27.562558 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.577374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.577428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.577446 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.577477 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.577495 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.680740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.680777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.680785 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.680802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.680812 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.785351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.785408 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.785424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.785451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.785472 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.888098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.888205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.888223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.888250 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.888270 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.991819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.991875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.991886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.991909 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:27 crc kubenswrapper[4834]: I1008 22:24:27.991921 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:27Z","lastTransitionTime":"2025-10-08T22:24:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.046714 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/0.log" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.046803 4834 generic.go:334] "Generic (PLEG): container finished" podID="b150123b-551e-4c12-afa1-0c651719d3f2" containerID="3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96" exitCode=1 Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.046865 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerDied","Data":"3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.047652 4834 scope.go:117] "RemoveContainer" containerID="3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.071749 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.092683 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:27Z\\\",\\\"message\\\":\\\"2025-10-08T22:23:41+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd\\\\n2025-10-08T22:23:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd to /host/opt/cni/bin/\\\\n2025-10-08T22:23:42Z [verbose] multus-daemon started\\\\n2025-10-08T22:23:42Z [verbose] Readiness Indicator file check\\\\n2025-10-08T22:24:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.094880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.094910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.094921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.094940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.094952 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.106987 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.119384 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.138598 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.156591 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.168769 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc 
kubenswrapper[4834]: I1008 22:24:28.182835 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.196944 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.198686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.198727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.198743 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.198766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.198781 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.216692 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.230899 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.248664 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.269529 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.282200 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.301767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.301819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.301832 4834 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.301853 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.301865 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.304941 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during 
admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.326210 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.338180 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.350689 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.368248 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:28Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.404372 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.404421 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.404430 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.404448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.404458 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.507481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.507532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.507542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.507563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.507574 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.555515 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:28 crc kubenswrapper[4834]: E1008 22:24:28.555727 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.611554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.612204 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.612318 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.612457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.612585 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.715733 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.715842 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.715862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.715892 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.715931 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.819735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.820247 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.820420 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.820571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.820723 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.924496 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.924563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.924580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.924608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:28 crc kubenswrapper[4834]: I1008 22:24:28.924633 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:28Z","lastTransitionTime":"2025-10-08T22:24:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.027782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.027872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.027919 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.027958 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.027983 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.054696 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/0.log" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.054794 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerStarted","Data":"fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.078777 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" 
len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.101597 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.123008 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.131424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.131498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.131517 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.131549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.131568 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.156057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network 
policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.181368 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.201688 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.220430 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148
009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.235248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.235316 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.235333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.235384 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.235403 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.242249 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.263630 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.280647 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:27Z\\\",\\\"message\\\":\\\"2025-10-08T22:23:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd\\\\n2025-10-08T22:23:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd to /host/opt/cni/bin/\\\\n2025-10-08T22:23:42Z [verbose] multus-daemon started\\\\n2025-10-08T22:23:42Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T22:24:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.294438 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c2
9660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.309512 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.323607 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.337280 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2
775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.338834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.338905 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.338932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.338969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.338994 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.350057 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc 
kubenswrapper[4834]: I1008 22:24:29.361675 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.388898 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.402783 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.415683 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:29Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.442001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.442058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.442071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.442094 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.442106 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.545235 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.545327 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.545349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.545382 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.545433 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.555292 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.555337 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:29 crc kubenswrapper[4834]: E1008 22:24:29.555459 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.555359 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:29 crc kubenswrapper[4834]: E1008 22:24:29.555538 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:29 crc kubenswrapper[4834]: E1008 22:24:29.555843 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.649448 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.649527 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.649545 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.649572 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.649590 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.752774 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.752821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.752867 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.752886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.752898 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.856699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.856846 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.856886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.856925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.856948 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.961133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.961217 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.961230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.961251 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:29 crc kubenswrapper[4834]: I1008 22:24:29.961264 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:29Z","lastTransitionTime":"2025-10-08T22:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.065486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.065568 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.065596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.065615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.065626 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.168450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.168544 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.168556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.168582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.168596 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.272229 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.272287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.272301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.272326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.272344 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.376486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.376548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.376563 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.376588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.376609 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.482462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.482548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.482567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.482593 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.482609 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.554550 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:30 crc kubenswrapper[4834]: E1008 22:24:30.554830 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.586964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.587050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.587073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.587109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.587137 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.691046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.691312 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.691341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.691370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.691389 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.794046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.794087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.794096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.794112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.794125 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.897906 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.897995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.898019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.898059 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:30 crc kubenswrapper[4834]: I1008 22:24:30.898089 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:30Z","lastTransitionTime":"2025-10-08T22:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.002293 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.002360 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.002374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.002450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.002469 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.105569 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.105643 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.105662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.105692 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.105710 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.209630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.209700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.209714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.209738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.209756 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.282757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.282820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.282833 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.282855 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.282872 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.297583 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:31Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.302125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.302225 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.302248 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.302276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.302292 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.318566 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:31Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.322710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.322754 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.322769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.322795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.322812 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.336524 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:31Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.341882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.341920 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.341932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.341952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.341965 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.354508 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:31Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.359582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.359637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.359655 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.359679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.359695 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.373897 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:31Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.374032 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.375952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.376013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.376029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.376051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.376067 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.479819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.479882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.479902 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.479931 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.479952 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.555474 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.555502 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.555736 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.555850 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.556021 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:31 crc kubenswrapper[4834]: E1008 22:24:31.556321 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.582607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.582666 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.582676 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.582696 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.582708 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.686771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.686891 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.686925 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.686959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.686984 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.791461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.791518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.791528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.791547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.791560 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.894792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.894868 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.894889 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.894918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.894938 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.998651 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.998732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.998748 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.998797 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:31 crc kubenswrapper[4834]: I1008 22:24:31.998813 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:31Z","lastTransitionTime":"2025-10-08T22:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.101776 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.101829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.101839 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.101863 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.101875 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.205493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.205571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.205590 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.205622 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.205648 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.309534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.309614 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.309631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.309663 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.309682 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.413735 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.413799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.413816 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.413840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.413855 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.517322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.517393 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.517407 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.517427 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.517441 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.554688 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:32 crc kubenswrapper[4834]: E1008 22:24:32.554943 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.620389 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.620442 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.620451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.620473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.620484 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.723661 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.723732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.723745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.723768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.723782 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.827398 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.827456 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.827466 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.827485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.827496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.931109 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.931224 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.931245 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.931280 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:32 crc kubenswrapper[4834]: I1008 22:24:32.931301 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:32Z","lastTransitionTime":"2025-10-08T22:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.036564 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.036641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.036660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.036687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.036706 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.141656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.141711 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.141728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.141757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.141776 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.244638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.244722 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.244741 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.244777 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.244796 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.348913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.349011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.349034 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.349066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.349101 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.452428 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.452511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.452537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.452566 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.452585 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.554578 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.554655 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:33 crc kubenswrapper[4834]: E1008 22:24:33.554833 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.554934 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:33 crc kubenswrapper[4834]: E1008 22:24:33.555206 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:33 crc kubenswrapper[4834]: E1008 22:24:33.555361 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.556046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.556112 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.556137 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.556208 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.556234 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.576828 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.594346 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.614881 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.634232 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc 
kubenswrapper[4834]: I1008 22:24:33.652928 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.659098 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.659200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.659226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 
22:24:33.659256 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.659275 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.681889 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.702453 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.719310 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.744501 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.759900 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.762356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.762604 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.762645 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.762697 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.762718 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.777433 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.795801 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.812915 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.847266 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network 
policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.864863 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.867322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.867450 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.867513 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.867548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.867785 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.883639 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b
8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.904668 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.923727 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.939888 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:27Z\\\",\\\"message\\\":\\\"2025-10-08T22:23:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd\\\\n2025-10-08T22:23:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd to /host/opt/cni/bin/\\\\n2025-10-08T22:23:42Z [verbose] multus-daemon started\\\\n2025-10-08T22:23:42Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T22:24:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:33Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.970943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.971011 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.971031 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.971058 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:33 crc kubenswrapper[4834]: I1008 22:24:33.971076 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:33Z","lastTransitionTime":"2025-10-08T22:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.073234 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.073305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.073322 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.073350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.073365 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.176075 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.176110 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.176119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.176136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.176148 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.279255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.279335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.279357 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.279390 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.279413 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.381983 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.382123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.382207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.382239 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.382261 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.485962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.486044 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.486066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.486097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.486121 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.555695 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:34 crc kubenswrapper[4834]: E1008 22:24:34.555933 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.556012 4834 scope.go:117] "RemoveContainer" containerID="deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.588820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.588875 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.588888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.588910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.588924 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.692592 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.692654 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.692678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.692702 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.692714 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.796834 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.796915 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.796935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.797407 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.797520 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.900658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.900704 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.900721 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.900745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:34 crc kubenswrapper[4834]: I1008 22:24:34.900763 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:34Z","lastTransitionTime":"2025-10-08T22:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.003813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.003857 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.003866 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.003884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.003897 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.076884 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/2.log" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.078806 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.079279 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.094641 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7e
c6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.106407 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.106441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.106459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.106478 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 
22:24:35.106491 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.106650 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f
12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.120170 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator 
for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc
273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.133469 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.147293 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.168528 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network 
policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry 
ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.180060 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.196810 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.208822 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.208899 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.208948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.208963 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.208986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.209001 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.221121 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.232339 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:27Z\\\",\\\"message\\\":\\\"2025-10-08T22:23:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd\\\\n2025-10-08T22:23:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd to /host/opt/cni/bin/\\\\n2025-10-08T22:23:42Z [verbose] multus-daemon started\\\\n2025-10-08T22:23:42Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T22:24:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.242656 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.251119 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.262811 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.274971 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc 
kubenswrapper[4834]: I1008 22:24:35.288696 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.311882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.311932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.311943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 
22:24:35.311962 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.311974 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.318533 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.333812 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.350266 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.415441 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.415514 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.415529 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.415552 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.415593 4834 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.518287 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.518351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.518365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.518385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.518419 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.554650 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.554659 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.554738 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:35 crc kubenswrapper[4834]: E1008 22:24:35.554845 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:35 crc kubenswrapper[4834]: E1008 22:24:35.555223 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:35 crc kubenswrapper[4834]: E1008 22:24:35.555213 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.621289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.621330 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.621339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.621356 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.621369 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.724995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.725105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.725125 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.725178 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.725197 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.828620 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.828681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.828698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.828726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.828744 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.931648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.931710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.931732 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.931790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:35 crc kubenswrapper[4834]: I1008 22:24:35.931808 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:35Z","lastTransitionTime":"2025-10-08T22:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.039639 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.039762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.039787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.039819 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.039842 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.087276 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/3.log" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.088853 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/2.log" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.094639 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" exitCode=1 Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.094729 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.094802 4834 scope.go:117] "RemoveContainer" containerID="deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.098309 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:24:36 crc kubenswrapper[4834]: E1008 22:24:36.098747 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.121718 4834 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.140504 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.144122 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.144230 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.144250 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.144276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.144297 4834 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.163951 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf
76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.181018 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.206422 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deecd185b10b00df854361ff8f442f4ee8a97477919dc4d401af77762fdb0b2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:05Z\\\",\\\"message\\\":\\\"ry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-wxzpv\\\\nF1008 
22:24:05.381663 6503 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:05Z is after 2025-08-24T17:21:41Z]\\\\nI1008 22:24:05.381672 6503 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-wrrs9\\\\nI1008 22:24:05.381571 6503 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1008 22:24:05.381593 6503 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1008 22:24:05.381696 6503 obj_retry.go:303] Retry ob\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:35Z\\\",\\\"message\\\":\\\"chine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1008 22:24:35.421807 6902 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 
2025\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e
1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.225364 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.242776 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.247772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.247876 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.247896 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.247964 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.247986 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.259530 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.284910 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.297814 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.317460 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:27Z\\\",\\\"message\\\":\\\"2025-10-08T22:23:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd\\\\n2025-10-08T22:23:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd to /host/opt/cni/bin/\\\\n2025-10-08T22:23:42Z [verbose] multus-daemon started\\\\n2025-10-08T22:23:42Z [verbose] 
Readiness Indicator file check\\\\n2025-10-08T22:24:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.330927 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c2
9660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.349903 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.351424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.351486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.351505 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.351536 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.351559 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.368668 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.387507 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.405253 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc 
kubenswrapper[4834]: I1008 22:24:36.420573 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.433477 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.448064 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3
e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:36Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.454715 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.454775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.454799 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc 
kubenswrapper[4834]: I1008 22:24:36.454829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.454850 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.554901 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:36 crc kubenswrapper[4834]: E1008 22:24:36.555079 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.559271 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.559342 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.559367 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.559395 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.559415 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.662907 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.662988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.663009 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.663040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.663061 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.765710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.765768 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.765784 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.765806 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.765851 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.868426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.868461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.868469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.868485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.868494 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.971488 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.971541 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.971558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.971584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:36 crc kubenswrapper[4834]: I1008 22:24:36.971602 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:36Z","lastTransitionTime":"2025-10-08T22:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.075429 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.075476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.075486 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.075508 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.075519 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.100511 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/3.log" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.106871 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.107204 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.126537 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5404afad3f8636b2b557580bc600c93af071b936a1f405550eb4f4796e2b8807\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.142851 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kh4dw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08f0a28c-1bc4-4302-bc15-b2d975ef0b47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f4180b4a1c53a0a2391fb0bb382d16ee82311f996c50939b64cd4761a94a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-wg9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kh4dw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.161353 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"732cf917-b3ec-4649-99b0-66653902cfc2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d17e6e2d5acd573d28eb32c81a6dd2
775947af89f0c66a1bc3387f1302e1381\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hkrrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-10-08T22:23:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f9m4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.180661 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e266421d-b52e-42f9-a7db-88f09ba1c075\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nrqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:54Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-g7fd8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc 
kubenswrapper[4834]: I1008 22:24:37.184512 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.184678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.184793 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.184837 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.184862 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.198610 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea401b96-1c03-49cc-9aee-4e2032362e25\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3648e303d2c1a4d2a6347de8d21cd9f6f9f2951173d15e272cff43b5602b0368\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c120392ae42eac5393d68092791a01fd36753d6de67a2b5a9956783b9b824fe2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.222011 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e8bd237-900d-4023-9e12-8ba535844092\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f4ae6f72656a5bf9ffe0534ddddd522caa319fd5b7376a170babbe9611766c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2833647d681f7a3e1fc4402840db8c4f65427f4f4a71e19bc476192206f32e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://75c94a0bb8b0f1fd32c5fed9cd818b71ae675ff3ea02cf1817bb84088f09528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1a82e43b75b3baec1aab54cae64fa3df9a2abffea67fd60b4536812bdcc92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b7843539f232342cd16a018d2b0294c1c785abcbf4db4342deaa609d5f6ced1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bd743709642c8aad9c7f1b274e5961fb64cf76408921e4c368e0409f302f79\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2631b68c98d0e823fa4330b7d7b5fab750dcb781cec2935c029d43e7399e9af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://408c785f167cc481966e9825ac96b57e62a2714ddf65e8699304403bcf622014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.239524 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b2d50ed-2670-4513-8b12-e759ed0a3807\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d82425b1483ab18d721627802d1539e14fdb33d20c6a075d71d82b48ff096207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24b83806b235a8782dbf294d52932cc25571ee4737a464484d9fecab836a2db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b6fa84b18ce1ffee80b3f723e3acec1522c23b6aa8357cbeac511448aab7588\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c512f1d83f8aef97d68e9e21197ed8fa398900fe0076066214a987cd2b3e088\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.252386 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1782bdf0cb3f39c9a2ad89f1058fe55813e4b6a16418178fe07c76c3df8af11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77948cf8ea5080c387b6b2ca0d600399375ae9b483e3587afff845c604e87873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.269525 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7944e142-203a-405b-bd09-88f3512170e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54611c6d7ce2e6073f0e5ce15f1aa0637d56edee7547e893b1161e39feeffcde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd1a73af3668b77677a6abec23c494716f243
d581440a9cbb861903b036de8e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q9djz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-8svwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.287769 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.287856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.287870 4834 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.287893 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.287907 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.288066 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28c6dc40-8100-4612-8283-fd7b49e42d0d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbd49157248b17f288749672b539ca38843540e5498f9fe81077a30bb40a6d09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0e7ddcda3887da44405c10085bc69f60b44a251e7d4deb1e60343194de86c45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43cde5b13dd577119482a88706fedbd4db9446f29a2b5d14ff10e9990e23814\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://257fd50c062aa48c2b6ccee2be222efe5a21c681d13dca7d2148ef5a4281b890\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c96c56d9eb4889971261f1963e9d3fdbf8883becff5b0ad5154a33d597b1ee8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"W1008 22:23:17.007782 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1008 22:23:17.008185 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759962197 cert, and key in /tmp/serving-cert-765974509/serving-signer.crt, /tmp/serving-cert-765974509/serving-signer.key\\\\nI1008 22:23:17.301591 1 observer_polling.go:159] Starting file observer\\\\nW1008 22:23:27.304463 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1008 22:23:27.304823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1008 22:23:27.307540 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-765974509/tls.crt::/tmp/serving-cert-765974509/tls.key\\\\\\\"\\\\nI1008 22:23:33.519211 1 
requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1008 22:23:33.525269 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1008 22:23:33.525314 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1008 22:23:33.525851 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1008 22:23:33.525960 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nF1008 22:23:33.564055 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c7db46f1df464dc1177342385267e60f3cdf9e6d0e43978a617fc9bc82d4a63\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7037899c6054fe831ad6254b77705b2d860154f41aaffda50bfc273dcdbb1ad2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.304392 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1edeb8fd6526ab2a79dc0054c1431d7571504fcb7baa6b65582ceb00518e5ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.317798 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.338430 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:35Z\\\",\\\"message\\\":\\\"chine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 
10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1008 22:24:35.421807 6902 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:35Z is after 2025\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:24:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f83d0247694f2749ab
19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fw97q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wrrs9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.354094 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ae96bc9-69e0-4f09-b59b-92157a2d5948\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78cf0e77652e73f7ec6e69ea39ddb14321165e356a57d950dca34b5f03cac246\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d234af810791aeab13ae8df780a138723bde081c5b82417777aa572c5ab09553\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a6ba70bd743ae1c57b9ac28270a7e004ffd8ea55f6e00b962870e892e2510a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:41Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67e33b83e590be2e04953128b934e6557fb409d4dfd367bdddf34bdb61926eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99dc2
fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99dc2fb6aee061f4e669b33e1630dac2d8248d7063441ffa0271c88762b11f2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300cf13f6012781db36fe850aaedf424f140a9c4eb1f15e291a8917461c26ffa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3528ce905d38a0a76a1d60d746158e1d0f601f89a4d6cf6599416ed32761fb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqcx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xsqwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.367053 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"399e10bc-a6d4-4f8a-b205-79fd2639b6ce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaded9d1a4ba5fc97b04ac0968d3abdc4f389864754064554e59a719869ca3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f15a41148009702a77c601901941b8e079f0a12ddf642fa6cb762dc71be69bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://609506e03402de713e44a88255d61dc4d41513a9218e8e13a60a287db447bafb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b
43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59edccb880112b15c33aa457b43d77b0bd2bebd9e89b0636e9de1b68cef26ab8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-08T22:23:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.382025 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.390403 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.390457 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.390469 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.390487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.390499 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.394480 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:33Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.400570 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.400788 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:41.400771542 +0000 UTC m=+149.223656288 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.409964 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f297z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b150123b-551e-4c12-afa1-0c651719d3f2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastSta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-08T22:24:27Z\\\",\\\"message\\\":\\\"2025-10-08T22:23:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd\\\\n2025-10-08T22:23:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b215d38f-27fb-4107-89de-80243510a2bd to /host/opt/cni/bin/\\\\n2025-10-08T22:23:42Z [verbose] multus-daemon started\\\\n2025-10-08T22:23:42Z [verbose] Readiness Indicator file check\\\\n2025-10-08T22:24:27Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-08T22:23:40Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\
\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnh95\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f297z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.419842 4834 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-wxzpv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"604cbcf3-0c65-49d1-b7af-4ac41fba5bab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-08T22:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53596afb2f842e4c29660ef962694f13c32127372cf68cab32bd532e795d0933\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-08T22:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-08T22:23:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-wxzpv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:37Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.493079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.493133 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.493164 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.493185 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.493199 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.501621 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.501657 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.501683 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.501764 4834 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.501789 4834 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.501824 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:25:41.501808574 +0000 UTC m=+149.324693320 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.501840 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-08 22:25:41.501834094 +0000 UTC m=+149.324718840 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.502008 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.502053 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.502073 4834 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.502207 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-08 22:25:41.502170803 +0000 UTC m=+149.325055569 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.555362 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.555362 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.555456 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.555477 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.555534 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075"
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.555609 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.594745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.594790 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.594802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.594821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.594836 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.602439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.602665 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.602709 4834 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.602734 4834 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 22:24:37 crc kubenswrapper[4834]: E1008 22:24:37.602848 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-08 22:25:41.602814735 +0000 UTC m=+149.425699641 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.698213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.698284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.698303 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.698336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.698366 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.801028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.801066 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.801074 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.801090 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.801100 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.904577 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.904660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.904679 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.904716 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:37 crc kubenswrapper[4834]: I1008 22:24:37.904740 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:37Z","lastTransitionTime":"2025-10-08T22:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.007821 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.007870 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.007882 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.007898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.007910 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.109637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.109672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.109681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.109698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.109710 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.213211 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.213278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.213301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.213333 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.213358 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.316415 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.316494 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.316518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.316554 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.316578 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.419373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.419424 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.419436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.419506 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.419518 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.523268 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.523328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.523345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.523373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.523394 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.554738 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 08 22:24:38 crc kubenswrapper[4834]: E1008 22:24:38.555022 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.627437 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.627640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.627736 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.627840 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.627937 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.730242 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.730286 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.730298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.730317 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.730331 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.841557 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.842313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.842326 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.842350 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.842366 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.946294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.946373 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.946397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.946431 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:38 crc kubenswrapper[4834]: I1008 22:24:38.946455 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:38Z","lastTransitionTime":"2025-10-08T22:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.079537 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.079609 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.079630 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.079658 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.079677 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.182761 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.182841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.182858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.182889 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.182911 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.285881 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.285939 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.285959 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.285988 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.286007 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.390597 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.390671 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.390691 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.390719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.390737 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.494096 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.494221 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.494244 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.494276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.494298 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.554879 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.554946 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.554949 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8"
Oct 08 22:24:39 crc kubenswrapper[4834]: E1008 22:24:39.555124 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 08 22:24:39 crc kubenswrapper[4834]: E1008 22:24:39.555395 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 08 22:24:39 crc kubenswrapper[4834]: E1008 22:24:39.555494 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.596628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.596727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.596745 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.596775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.596796 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.700022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.700115 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.700138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.700205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.700307 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.803929 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.804451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.804473 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.804500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.804559 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.907723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.907771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.907787 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.907809 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:39 crc kubenswrapper[4834]: I1008 22:24:39.907825 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:39Z","lastTransitionTime":"2025-10-08T22:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.010530 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.010559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.010567 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.010581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.010591 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.113815 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.113877 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.113903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.113935 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.113957 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.218519 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.218559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.218571 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.218587 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.218596 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.321500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.321534 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.321543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.321558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.321567 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.424865 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.424897 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.424906 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.424921 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.424932 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.527707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.527746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.527758 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.527778 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.527790 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.555448 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:40 crc kubenswrapper[4834]: E1008 22:24:40.555580 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.630439 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.630476 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.630487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.630504 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.630515 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.733503 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.733538 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.733546 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.733560 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.733570 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.836411 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.836452 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.836464 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.836484 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.836496 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.938699 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.938750 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.938762 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.938782 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:40 crc kubenswrapper[4834]: I1008 22:24:40.938799 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:40Z","lastTransitionTime":"2025-10-08T22:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.041289 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.041339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.041348 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.041365 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.041379 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.147449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.147532 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.147543 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.147583 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.147601 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.250589 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.250628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.250638 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.250652 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.250663 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.353582 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.353641 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.353659 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.353682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.353699 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.456436 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.456526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.456547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.456575 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.456594 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.497269 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.497339 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.497359 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.497385 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.497417 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.516217 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.522345 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.522404 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.522422 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.522451 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.522478 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.544347 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.549874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.549923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.549932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.549948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.549959 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.554715 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.554731 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.554834 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.555089 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.555349 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.555740 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.569927 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.575418 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.575487 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.575511 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.575542 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.575565 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.596606 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.601046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.601328 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.601414 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.601438 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.601477 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.627480 4834 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-08T22:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d21aca01-445d-4de9-b847-f69c4d8c7264\\\",\\\"systemUUID\\\":\\\"df65835b-239f-4335-94d8-90a9d75b5252\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-08T22:24:41Z is after 2025-08-24T17:21:41Z" Oct 08 22:24:41 crc kubenswrapper[4834]: E1008 22:24:41.627695 4834 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.630053 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.630087 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.630099 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.630116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.630126 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.733509 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.733548 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.733558 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.733577 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.733589 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.836794 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.836862 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.836885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.836918 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.836943 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.940198 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.940237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.940249 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.940267 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:41 crc kubenswrapper[4834]: I1008 22:24:41.940280 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:41Z","lastTransitionTime":"2025-10-08T22:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.042999 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.043073 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.043100 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.043131 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.043194 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.146909 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.146976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.146995 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.147022 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.147041 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.249912 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.249973 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.249992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.250019 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.250036 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.352726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.352802 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.352841 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.352879 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.352902 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.456252 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.456294 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.456304 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.456323 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.456335 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.555499 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:42 crc kubenswrapper[4834]: E1008 22:24:42.555700 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.560220 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.560261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.560272 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.560290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.560302 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.663226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.663299 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.663324 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.663355 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.663378 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.767003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.767084 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.767127 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.767196 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.767227 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.870797 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.870860 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.870884 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.870914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.870936 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.973432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.973485 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.973502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.973526 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:42 crc kubenswrapper[4834]: I1008 22:24:42.973544 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:42Z","lastTransitionTime":"2025-10-08T22:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.076377 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.076461 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.076480 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.076509 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.076532 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.180068 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.180180 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.180205 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.180237 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.180261 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.283048 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.283086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.283104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.283129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.283175 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.387119 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.387223 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.387246 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.387278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.387299 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.490580 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.490640 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.490656 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.490682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.490703 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.554835 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.554929 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:43 crc kubenswrapper[4834]: E1008 22:24:43.555089 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.555252 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:43 crc kubenswrapper[4834]: E1008 22:24:43.555437 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:43 crc kubenswrapper[4834]: E1008 22:24:43.555625 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.592308 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.59219471 podStartE2EDuration="41.59219471s" podCreationTimestamp="2025-10-08 22:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.5921974 +0000 UTC m=+91.415082146" watchObservedRunningTime="2025-10-08 22:24:43.59219471 +0000 UTC m=+91.415079456" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.594608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.594682 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.594693 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.594710 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.594719 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.684379 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f297z" podStartSLOduration=64.684342673 podStartE2EDuration="1m4.684342673s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.662957752 +0000 UTC m=+91.485842528" watchObservedRunningTime="2025-10-08 22:24:43.684342673 +0000 UTC m=+91.507227419" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.697081 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.697129 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.697160 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.697188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.697204 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.698825 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wxzpv" podStartSLOduration=64.698801712 podStartE2EDuration="1m4.698801712s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.684307393 +0000 UTC m=+91.507192159" watchObservedRunningTime="2025-10-08 22:24:43.698801712 +0000 UTC m=+91.521686468" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.734954 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podStartSLOduration=64.734923571 podStartE2EDuration="1m4.734923571s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.734672635 +0000 UTC m=+91.557557381" watchObservedRunningTime="2025-10-08 22:24:43.734923571 +0000 UTC m=+91.557808327" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.735312 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kh4dw" podStartSLOduration=64.735302949 podStartE2EDuration="1m4.735302949s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.719925308 +0000 UTC m=+91.542810074" watchObservedRunningTime="2025-10-08 22:24:43.735302949 +0000 UTC m=+91.558187715" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.799922 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.799960 
4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.799971 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.799986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.799997 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.829735 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.829714576 podStartE2EDuration="1m10.829714576s" podCreationTimestamp="2025-10-08 22:23:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.829109542 +0000 UTC m=+91.651994278" watchObservedRunningTime="2025-10-08 22:24:43.829714576 +0000 UTC m=+91.652599322" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.830130 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.830125526 podStartE2EDuration="23.830125526s" podCreationTimestamp="2025-10-08 22:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.794926739 +0000 UTC 
m=+91.617811485" watchObservedRunningTime="2025-10-08 22:24:43.830125526 +0000 UTC m=+91.653010262" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.862390 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.862366393 podStartE2EDuration="1m6.862366393s" podCreationTimestamp="2025-10-08 22:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.846779747 +0000 UTC m=+91.669664493" watchObservedRunningTime="2025-10-08 22:24:43.862366393 +0000 UTC m=+91.685251139" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.879192 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.879167327 podStartE2EDuration="1m9.879167327s" podCreationTimestamp="2025-10-08 22:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.877781374 +0000 UTC m=+91.700666120" watchObservedRunningTime="2025-10-08 22:24:43.879167327 +0000 UTC m=+91.702052073" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.903079 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.903138 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.903183 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.903210 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 
22:24:43.903261 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:43Z","lastTransitionTime":"2025-10-08T22:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.980080 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-8svwx" podStartSLOduration=64.980055155 podStartE2EDuration="1m4.980055155s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.978924759 +0000 UTC m=+91.801809505" watchObservedRunningTime="2025-10-08 22:24:43.980055155 +0000 UTC m=+91.802939921" Oct 08 22:24:43 crc kubenswrapper[4834]: I1008 22:24:43.980960 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xsqwx" podStartSLOduration=64.980951207 podStartE2EDuration="1m4.980951207s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:43.964220984 +0000 UTC m=+91.787105730" watchObservedRunningTime="2025-10-08 22:24:43.980951207 +0000 UTC m=+91.803835973" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.006354 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.006459 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc 
kubenswrapper[4834]: I1008 22:24:44.006481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.006518 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.006540 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.108930 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.108967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.108976 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.108992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.109003 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.212040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.212086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.212095 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.212114 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.212125 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.314698 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.314738 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.314753 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.314772 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.314805 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.417254 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.417290 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.417302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.417321 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.417334 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.520498 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.520584 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.520608 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.520647 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.520672 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.555036 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:44 crc kubenswrapper[4834]: E1008 22:24:44.555412 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.624804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.624886 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.624911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.624947 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.624974 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.729188 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.729263 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.729282 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.729311 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.729330 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.832042 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.832105 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.832123 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.832185 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.832206 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.935991 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.936068 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.936088 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.936117 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:44 crc kubenswrapper[4834]: I1008 22:24:44.936172 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:44Z","lastTransitionTime":"2025-10-08T22:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.039672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.039775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.039792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.039820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.039842 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.142397 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.142474 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.142491 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.142522 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.142542 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.245631 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.245700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.245719 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.245755 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.245776 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.350270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.350352 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.350378 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.350413 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.350440 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.454619 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.455046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.455278 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.455500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.455762 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.555498 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.555662 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.555732 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:45 crc kubenswrapper[4834]: E1008 22:24:45.555799 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:45 crc kubenswrapper[4834]: E1008 22:24:45.555904 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:45 crc kubenswrapper[4834]: E1008 22:24:45.556114 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.559887 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.559963 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.559986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.560016 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.560038 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.663795 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.663874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.663900 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.663942 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.663967 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.767391 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.767481 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.767502 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.767535 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.767557 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.871872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.871950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.871969 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.871998 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.872015 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.974767 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.974888 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.974914 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.974950 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:45 crc kubenswrapper[4834]: I1008 22:24:45.974974 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:45Z","lastTransitionTime":"2025-10-08T22:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.078675 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.078804 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.078856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.078890 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.078910 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.183315 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.183399 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.183426 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.183462 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.183486 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.288523 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.288588 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.288607 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.288636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.288655 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.392856 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.392923 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.392940 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.392967 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.392988 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.496628 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.496688 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.496705 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.496728 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.496746 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.555908 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:46 crc kubenswrapper[4834]: E1008 22:24:46.556650 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.600418 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.600496 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.600516 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.600547 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.600570 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.704791 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.705444 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.705463 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.705493 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.705516 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.809232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.809302 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.809320 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.809349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.809369 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.913276 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.913349 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.913380 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.913416 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:46 crc kubenswrapper[4834]: I1008 22:24:46.913439 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:46Z","lastTransitionTime":"2025-10-08T22:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.017201 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.017260 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.017277 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.017304 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.017322 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.121467 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.121531 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.121550 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.121579 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.121598 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.225524 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.225653 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.225681 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.225724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.225750 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.328798 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.328874 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.328898 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.328932 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.328962 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.433071 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.433175 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.433191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.433213 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.433231 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.537013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.537086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.537103 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.537136 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.537186 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.555437 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.555571 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:47 crc kubenswrapper[4834]: E1008 22:24:47.555675 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.555715 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:47 crc kubenswrapper[4834]: E1008 22:24:47.555870 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:47 crc kubenswrapper[4834]: E1008 22:24:47.556078 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.641320 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.641409 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.641432 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.641470 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.641492 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.744370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.744449 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.744468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.744500 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.744519 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.848611 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.848687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.848712 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.848752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.848775 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.953051 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.953226 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.953305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.953341 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:47 crc kubenswrapper[4834]: I1008 22:24:47.953368 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:47Z","lastTransitionTime":"2025-10-08T22:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.057600 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.057670 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.057690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.057724 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.057745 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.160447 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.160559 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.160579 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.160612 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.160633 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.264191 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.264261 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.264281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.264313 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.264336 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.367615 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.367706 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.367726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.367752 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.367771 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.472046 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.472104 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.472121 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.472187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.472207 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.555488 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:48 crc kubenswrapper[4834]: E1008 22:24:48.555756 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.576139 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.576241 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.576270 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.576301 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.576320 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.680126 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.680255 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.680281 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.680319 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.680345 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.784273 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.784351 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.784370 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.784402 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.784422 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.888097 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.888187 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.888207 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.888232 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.888248 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.991677 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.991770 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.991792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.991822 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:48 crc kubenswrapper[4834]: I1008 22:24:48.992068 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:48Z","lastTransitionTime":"2025-10-08T22:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.095872 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.095980 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.095997 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.096028 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.096050 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.200040 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.200102 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.200116 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.200135 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.200175 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.303941 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.304020 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.304049 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.304082 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.304107 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.408880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.408952 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.408975 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.409003 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.409022 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.512946 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.513025 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.513047 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.513086 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.513112 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.556703 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.556768 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.556859 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:49 crc kubenswrapper[4834]: E1008 22:24:49.557497 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:49 crc kubenswrapper[4834]: E1008 22:24:49.557894 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:49 crc kubenswrapper[4834]: E1008 22:24:49.558183 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.558334 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:24:49 crc kubenswrapper[4834]: E1008 22:24:49.558624 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.617700 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.617747 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.617756 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.617775 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.617785 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.722089 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.722284 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.722305 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.722336 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.722355 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.826200 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.826307 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.826335 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.826374 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.826400 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.929865 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.929933 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.929960 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.929992 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:49 crc kubenswrapper[4834]: I1008 22:24:49.930012 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:49Z","lastTransitionTime":"2025-10-08T22:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.034106 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.034238 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.034265 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.034298 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.034317 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.138468 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.138556 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.138573 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.138602 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.138621 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.242757 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.242852 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.242878 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.242913 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.242934 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.346740 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.346811 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.346828 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.346858 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.346883 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.450895 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.450979 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.451004 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.451067 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.451092 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.554641 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.554820 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.554880 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: E1008 22:24:50.554874 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.554903 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.554943 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.554967 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.658596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.658669 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.658687 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.658717 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.658740 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.761948 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.762001 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.762010 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.762029 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.762040 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.871695 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.871805 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.871829 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.871871 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.871892 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.975528 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.975596 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.975624 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.975662 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:50 crc kubenswrapper[4834]: I1008 22:24:50.975684 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:50Z","lastTransitionTime":"2025-10-08T22:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.078813 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.078885 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.078911 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.078949 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.078974 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.182636 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.182690 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.182707 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.182726 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.182739 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.286453 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.286525 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.286549 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.286581 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.286601 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.389686 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.389749 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.389766 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.389792 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.389810 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.493972 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.494039 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.494050 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.494072 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.494087 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.555234 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.555272 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.555255 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:51 crc kubenswrapper[4834]: E1008 22:24:51.555422 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:51 crc kubenswrapper[4834]: E1008 22:24:51.555550 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:51 crc kubenswrapper[4834]: E1008 22:24:51.555835 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.597672 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.597714 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.597727 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.597746 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.597760 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.700910 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.700968 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.700986 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.701013 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.701033 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.803605 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.803648 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.803660 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.803678 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.803690 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.829723 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.829771 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.829780 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.829796 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.829807 4834 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-08T22:24:51Z","lastTransitionTime":"2025-10-08T22:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.880181 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz"] Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.880644 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.884078 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.886050 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.886337 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.886382 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.974491 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e1a602e8-19fe-4382-9104-4abbc413dc69-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.974568 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e1a602e8-19fe-4382-9104-4abbc413dc69-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.974676 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e1a602e8-19fe-4382-9104-4abbc413dc69-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.974703 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a602e8-19fe-4382-9104-4abbc413dc69-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:51 crc kubenswrapper[4834]: I1008 22:24:51.974724 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a602e8-19fe-4382-9104-4abbc413dc69-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076008 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e1a602e8-19fe-4382-9104-4abbc413dc69-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e1a602e8-19fe-4382-9104-4abbc413dc69-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076252 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a602e8-19fe-4382-9104-4abbc413dc69-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076290 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a602e8-19fe-4382-9104-4abbc413dc69-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076302 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e1a602e8-19fe-4382-9104-4abbc413dc69-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076354 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e1a602e8-19fe-4382-9104-4abbc413dc69-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.076328 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e1a602e8-19fe-4382-9104-4abbc413dc69-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.078407 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e1a602e8-19fe-4382-9104-4abbc413dc69-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.089553 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1a602e8-19fe-4382-9104-4abbc413dc69-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.099948 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a602e8-19fe-4382-9104-4abbc413dc69-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5rljz\" (UID: \"e1a602e8-19fe-4382-9104-4abbc413dc69\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.202624 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" Oct 08 22:24:52 crc kubenswrapper[4834]: I1008 22:24:52.554981 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:52 crc kubenswrapper[4834]: E1008 22:24:52.555137 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:53 crc kubenswrapper[4834]: I1008 22:24:53.172709 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" event={"ID":"e1a602e8-19fe-4382-9104-4abbc413dc69","Type":"ContainerStarted","Data":"6040dd94e087d50183d9435a28dbeb9b026423a98b6055f0aeee7407480b1f6f"} Oct 08 22:24:53 crc kubenswrapper[4834]: I1008 22:24:53.172835 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" event={"ID":"e1a602e8-19fe-4382-9104-4abbc413dc69","Type":"ContainerStarted","Data":"68fe44337584c3ce7d67047b8faf5019de5840d98d8a925fd0e84e95cfa34142"} Oct 08 22:24:53 crc kubenswrapper[4834]: I1008 22:24:53.199787 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5rljz" podStartSLOduration=74.199756225 podStartE2EDuration="1m14.199756225s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:24:53.199675023 +0000 UTC m=+101.022559839" watchObservedRunningTime="2025-10-08 22:24:53.199756225 +0000 UTC m=+101.022641011" Oct 08 22:24:53 crc kubenswrapper[4834]: I1008 22:24:53.555549 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:53 crc kubenswrapper[4834]: I1008 22:24:53.555721 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:53 crc kubenswrapper[4834]: I1008 22:24:53.558517 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:53 crc kubenswrapper[4834]: E1008 22:24:53.558650 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:53 crc kubenswrapper[4834]: E1008 22:24:53.558791 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:53 crc kubenswrapper[4834]: E1008 22:24:53.559092 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:54 crc kubenswrapper[4834]: I1008 22:24:54.555518 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:54 crc kubenswrapper[4834]: E1008 22:24:54.555708 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:55 crc kubenswrapper[4834]: I1008 22:24:55.555415 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:55 crc kubenswrapper[4834]: I1008 22:24:55.555458 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:55 crc kubenswrapper[4834]: E1008 22:24:55.556033 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:55 crc kubenswrapper[4834]: E1008 22:24:55.556096 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:55 crc kubenswrapper[4834]: I1008 22:24:55.556645 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:55 crc kubenswrapper[4834]: E1008 22:24:55.556859 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:56 crc kubenswrapper[4834]: I1008 22:24:56.555606 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:56 crc kubenswrapper[4834]: E1008 22:24:56.556021 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:57 crc kubenswrapper[4834]: I1008 22:24:57.554793 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:57 crc kubenswrapper[4834]: I1008 22:24:57.554804 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:57 crc kubenswrapper[4834]: E1008 22:24:57.555052 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:24:57 crc kubenswrapper[4834]: I1008 22:24:57.554804 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:57 crc kubenswrapper[4834]: E1008 22:24:57.555222 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:57 crc kubenswrapper[4834]: E1008 22:24:57.555495 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:58 crc kubenswrapper[4834]: I1008 22:24:58.061287 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:58 crc kubenswrapper[4834]: E1008 22:24:58.061578 4834 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:24:58 crc kubenswrapper[4834]: E1008 22:24:58.061684 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs podName:e266421d-b52e-42f9-a7db-88f09ba1c075 nodeName:}" failed. No retries permitted until 2025-10-08 22:26:02.061653171 +0000 UTC m=+169.884537947 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs") pod "network-metrics-daemon-g7fd8" (UID: "e266421d-b52e-42f9-a7db-88f09ba1c075") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 08 22:24:58 crc kubenswrapper[4834]: I1008 22:24:58.554867 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:24:58 crc kubenswrapper[4834]: E1008 22:24:58.555069 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:24:59 crc kubenswrapper[4834]: I1008 22:24:59.554858 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:24:59 crc kubenswrapper[4834]: I1008 22:24:59.554978 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:24:59 crc kubenswrapper[4834]: I1008 22:24:59.554858 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:24:59 crc kubenswrapper[4834]: E1008 22:24:59.555070 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:24:59 crc kubenswrapper[4834]: E1008 22:24:59.555314 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:24:59 crc kubenswrapper[4834]: E1008 22:24:59.555477 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:00 crc kubenswrapper[4834]: I1008 22:25:00.554915 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:00 crc kubenswrapper[4834]: E1008 22:25:00.555194 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:01 crc kubenswrapper[4834]: I1008 22:25:01.555420 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:01 crc kubenswrapper[4834]: I1008 22:25:01.555508 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:01 crc kubenswrapper[4834]: I1008 22:25:01.556087 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:01 crc kubenswrapper[4834]: E1008 22:25:01.556231 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:01 crc kubenswrapper[4834]: I1008 22:25:01.556613 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:25:01 crc kubenswrapper[4834]: E1008 22:25:01.556646 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:01 crc kubenswrapper[4834]: E1008 22:25:01.556744 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:01 crc kubenswrapper[4834]: E1008 22:25:01.556918 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:25:02 crc kubenswrapper[4834]: I1008 22:25:02.555357 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:02 crc kubenswrapper[4834]: E1008 22:25:02.555587 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:03 crc kubenswrapper[4834]: I1008 22:25:03.555073 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:03 crc kubenswrapper[4834]: I1008 22:25:03.555114 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:03 crc kubenswrapper[4834]: E1008 22:25:03.557271 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:03 crc kubenswrapper[4834]: I1008 22:25:03.557367 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:03 crc kubenswrapper[4834]: E1008 22:25:03.557535 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:03 crc kubenswrapper[4834]: E1008 22:25:03.557756 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:04 crc kubenswrapper[4834]: I1008 22:25:04.554609 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:04 crc kubenswrapper[4834]: E1008 22:25:04.554969 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:05 crc kubenswrapper[4834]: I1008 22:25:05.555135 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:05 crc kubenswrapper[4834]: I1008 22:25:05.555252 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:05 crc kubenswrapper[4834]: I1008 22:25:05.555273 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:05 crc kubenswrapper[4834]: E1008 22:25:05.555308 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:05 crc kubenswrapper[4834]: E1008 22:25:05.555421 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:05 crc kubenswrapper[4834]: E1008 22:25:05.555552 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:06 crc kubenswrapper[4834]: I1008 22:25:06.554972 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:06 crc kubenswrapper[4834]: E1008 22:25:06.555372 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:07 crc kubenswrapper[4834]: I1008 22:25:07.555409 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:07 crc kubenswrapper[4834]: I1008 22:25:07.555453 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:07 crc kubenswrapper[4834]: I1008 22:25:07.555484 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:07 crc kubenswrapper[4834]: E1008 22:25:07.555631 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:07 crc kubenswrapper[4834]: E1008 22:25:07.555793 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:07 crc kubenswrapper[4834]: E1008 22:25:07.556052 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:08 crc kubenswrapper[4834]: I1008 22:25:08.555434 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:08 crc kubenswrapper[4834]: E1008 22:25:08.555684 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:09 crc kubenswrapper[4834]: I1008 22:25:09.554958 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:09 crc kubenswrapper[4834]: I1008 22:25:09.555104 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:09 crc kubenswrapper[4834]: E1008 22:25:09.555174 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:09 crc kubenswrapper[4834]: E1008 22:25:09.555343 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:09 crc kubenswrapper[4834]: I1008 22:25:09.555524 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:09 crc kubenswrapper[4834]: E1008 22:25:09.555579 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:10 crc kubenswrapper[4834]: I1008 22:25:10.554524 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:10 crc kubenswrapper[4834]: E1008 22:25:10.555429 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:11 crc kubenswrapper[4834]: I1008 22:25:11.554786 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:11 crc kubenswrapper[4834]: I1008 22:25:11.554814 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:11 crc kubenswrapper[4834]: E1008 22:25:11.555000 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:11 crc kubenswrapper[4834]: I1008 22:25:11.555027 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:11 crc kubenswrapper[4834]: E1008 22:25:11.555192 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:11 crc kubenswrapper[4834]: E1008 22:25:11.555521 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:12 crc kubenswrapper[4834]: I1008 22:25:12.555035 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:12 crc kubenswrapper[4834]: E1008 22:25:12.555295 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:13 crc kubenswrapper[4834]: E1008 22:25:13.503732 4834 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 08 22:25:13 crc kubenswrapper[4834]: I1008 22:25:13.555320 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:13 crc kubenswrapper[4834]: I1008 22:25:13.555402 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:13 crc kubenswrapper[4834]: I1008 22:25:13.555602 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:13 crc kubenswrapper[4834]: E1008 22:25:13.557564 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:13 crc kubenswrapper[4834]: E1008 22:25:13.558277 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:13 crc kubenswrapper[4834]: E1008 22:25:13.558697 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:13 crc kubenswrapper[4834]: I1008 22:25:13.559524 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:25:13 crc kubenswrapper[4834]: E1008 22:25:13.559787 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wrrs9_openshift-ovn-kubernetes(f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" Oct 08 22:25:13 crc kubenswrapper[4834]: E1008 22:25:13.690868 4834 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.258831 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/1.log" Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.260276 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/0.log" Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.260588 4834 generic.go:334] "Generic (PLEG): container finished" podID="b150123b-551e-4c12-afa1-0c651719d3f2" containerID="fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d" exitCode=1 Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.260601 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerDied","Data":"fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d"} Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.260730 4834 scope.go:117] "RemoveContainer" containerID="3c74b9b834ccb94b1f8773a7396bed9546bfb67a668fffee077968af079c8e96" Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.262127 4834 scope.go:117] "RemoveContainer" containerID="fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d" Oct 08 22:25:14 crc kubenswrapper[4834]: E1008 22:25:14.268260 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-f297z_openshift-multus(b150123b-551e-4c12-afa1-0c651719d3f2)\"" pod="openshift-multus/multus-f297z" podUID="b150123b-551e-4c12-afa1-0c651719d3f2" Oct 08 22:25:14 crc kubenswrapper[4834]: I1008 22:25:14.555276 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:14 crc kubenswrapper[4834]: E1008 22:25:14.555788 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:15 crc kubenswrapper[4834]: I1008 22:25:15.267244 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/1.log" Oct 08 22:25:15 crc kubenswrapper[4834]: I1008 22:25:15.554766 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:15 crc kubenswrapper[4834]: I1008 22:25:15.554808 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:15 crc kubenswrapper[4834]: I1008 22:25:15.554808 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:15 crc kubenswrapper[4834]: E1008 22:25:15.555068 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:15 crc kubenswrapper[4834]: E1008 22:25:15.555303 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:15 crc kubenswrapper[4834]: E1008 22:25:15.555564 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:16 crc kubenswrapper[4834]: I1008 22:25:16.555516 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:16 crc kubenswrapper[4834]: E1008 22:25:16.555810 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:17 crc kubenswrapper[4834]: I1008 22:25:17.554675 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:17 crc kubenswrapper[4834]: I1008 22:25:17.555250 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:17 crc kubenswrapper[4834]: E1008 22:25:17.555600 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:17 crc kubenswrapper[4834]: I1008 22:25:17.555940 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:17 crc kubenswrapper[4834]: E1008 22:25:17.556061 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:17 crc kubenswrapper[4834]: E1008 22:25:17.556667 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:18 crc kubenswrapper[4834]: I1008 22:25:18.555125 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:18 crc kubenswrapper[4834]: E1008 22:25:18.555898 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:18 crc kubenswrapper[4834]: E1008 22:25:18.691991 4834 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 22:25:19 crc kubenswrapper[4834]: I1008 22:25:19.554439 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:19 crc kubenswrapper[4834]: E1008 22:25:19.554777 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:19 crc kubenswrapper[4834]: I1008 22:25:19.554855 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:19 crc kubenswrapper[4834]: E1008 22:25:19.554996 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:19 crc kubenswrapper[4834]: I1008 22:25:19.555042 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:19 crc kubenswrapper[4834]: E1008 22:25:19.555203 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:20 crc kubenswrapper[4834]: I1008 22:25:20.555285 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:20 crc kubenswrapper[4834]: E1008 22:25:20.555762 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:21 crc kubenswrapper[4834]: I1008 22:25:21.555280 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:21 crc kubenswrapper[4834]: I1008 22:25:21.555346 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:21 crc kubenswrapper[4834]: I1008 22:25:21.555280 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:21 crc kubenswrapper[4834]: E1008 22:25:21.555531 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:21 crc kubenswrapper[4834]: E1008 22:25:21.555615 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:21 crc kubenswrapper[4834]: E1008 22:25:21.555775 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:22 crc kubenswrapper[4834]: I1008 22:25:22.554595 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:22 crc kubenswrapper[4834]: E1008 22:25:22.554749 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:23 crc kubenswrapper[4834]: I1008 22:25:23.554504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:23 crc kubenswrapper[4834]: I1008 22:25:23.554504 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:23 crc kubenswrapper[4834]: E1008 22:25:23.555468 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:23 crc kubenswrapper[4834]: I1008 22:25:23.555579 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:23 crc kubenswrapper[4834]: E1008 22:25:23.555745 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:23 crc kubenswrapper[4834]: E1008 22:25:23.555843 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:23 crc kubenswrapper[4834]: E1008 22:25:23.692475 4834 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 22:25:24 crc kubenswrapper[4834]: I1008 22:25:24.555028 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:24 crc kubenswrapper[4834]: E1008 22:25:24.555259 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:25 crc kubenswrapper[4834]: I1008 22:25:25.555173 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:25 crc kubenswrapper[4834]: I1008 22:25:25.555257 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:25 crc kubenswrapper[4834]: E1008 22:25:25.555365 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:25 crc kubenswrapper[4834]: I1008 22:25:25.555403 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:25 crc kubenswrapper[4834]: E1008 22:25:25.555579 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:25 crc kubenswrapper[4834]: E1008 22:25:25.555886 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:26 crc kubenswrapper[4834]: I1008 22:25:26.555457 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:26 crc kubenswrapper[4834]: E1008 22:25:26.555755 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:27 crc kubenswrapper[4834]: I1008 22:25:27.555222 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:27 crc kubenswrapper[4834]: I1008 22:25:27.555342 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:27 crc kubenswrapper[4834]: I1008 22:25:27.555269 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:27 crc kubenswrapper[4834]: E1008 22:25:27.555481 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:27 crc kubenswrapper[4834]: E1008 22:25:27.555699 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:27 crc kubenswrapper[4834]: E1008 22:25:27.555820 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:28 crc kubenswrapper[4834]: I1008 22:25:28.554865 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:28 crc kubenswrapper[4834]: I1008 22:25:28.555392 4834 scope.go:117] "RemoveContainer" containerID="fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d" Oct 08 22:25:28 crc kubenswrapper[4834]: E1008 22:25:28.555856 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:28 crc kubenswrapper[4834]: I1008 22:25:28.556102 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:25:28 crc kubenswrapper[4834]: E1008 22:25:28.693396 4834 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.317102 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/3.log" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.319578 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerStarted","Data":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.320040 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.321393 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/1.log" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.321435 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerStarted","Data":"e8ed8e2d5d9a78bb58ff2b75a8232ded894aa521707893d1b0e1ffa1dc3c4cb0"} Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.349743 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podStartSLOduration=110.349714167 podStartE2EDuration="1m50.349714167s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:29.349346787 +0000 UTC m=+137.172231563" watchObservedRunningTime="2025-10-08 22:25:29.349714167 +0000 UTC m=+137.172598933" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.555123 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.555205 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.555171 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:29 crc kubenswrapper[4834]: E1008 22:25:29.555348 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:29 crc kubenswrapper[4834]: E1008 22:25:29.555481 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:29 crc kubenswrapper[4834]: E1008 22:25:29.555554 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:29 crc kubenswrapper[4834]: I1008 22:25:29.767356 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7fd8"] Oct 08 22:25:30 crc kubenswrapper[4834]: I1008 22:25:30.324691 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:30 crc kubenswrapper[4834]: E1008 22:25:30.324818 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:30 crc kubenswrapper[4834]: I1008 22:25:30.555401 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:30 crc kubenswrapper[4834]: E1008 22:25:30.555733 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:31 crc kubenswrapper[4834]: I1008 22:25:31.555289 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:31 crc kubenswrapper[4834]: I1008 22:25:31.555320 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:31 crc kubenswrapper[4834]: E1008 22:25:31.555508 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:31 crc kubenswrapper[4834]: E1008 22:25:31.555614 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:32 crc kubenswrapper[4834]: I1008 22:25:32.554588 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:32 crc kubenswrapper[4834]: I1008 22:25:32.554679 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:32 crc kubenswrapper[4834]: E1008 22:25:32.554728 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 08 22:25:32 crc kubenswrapper[4834]: E1008 22:25:32.554812 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7fd8" podUID="e266421d-b52e-42f9-a7db-88f09ba1c075" Oct 08 22:25:33 crc kubenswrapper[4834]: I1008 22:25:33.555511 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:33 crc kubenswrapper[4834]: I1008 22:25:33.555555 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:33 crc kubenswrapper[4834]: E1008 22:25:33.558538 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 08 22:25:33 crc kubenswrapper[4834]: E1008 22:25:33.558700 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.354933 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.555396 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.555618 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.559482 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.559965 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.560614 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 08 22:25:34 crc kubenswrapper[4834]: I1008 22:25:34.560881 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 08 22:25:35 crc kubenswrapper[4834]: I1008 22:25:35.555196 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:35 crc kubenswrapper[4834]: I1008 22:25:35.555249 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:35 crc kubenswrapper[4834]: I1008 22:25:35.558699 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 08 22:25:35 crc kubenswrapper[4834]: I1008 22:25:35.558783 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.483907 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:41 crc kubenswrapper[4834]: E1008 22:25:41.484113 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:27:43.484074081 +0000 UTC m=+271.306958847 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.585662 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.585709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.585751 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.593679 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.595605 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.611284 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.686399 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.692397 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 
22:25:41.777085 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.878970 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 08 22:25:41 crc kubenswrapper[4834]: I1008 22:25:41.893456 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:42 crc kubenswrapper[4834]: W1008 22:25:42.182169 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-17038ed5cd74a89bf2dae1c407d75e4cec8489b3ff6d36b8cbb4da7c06d56e76 WatchSource:0}: Error finding container 17038ed5cd74a89bf2dae1c407d75e4cec8489b3ff6d36b8cbb4da7c06d56e76: Status 404 returned error can't find the container with id 17038ed5cd74a89bf2dae1c407d75e4cec8489b3ff6d36b8cbb4da7c06d56e76 Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.369419 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e1507aadeb68e29ce37fdd0eb8855c5ab2ca0f2e4d57a8b57f1536aa0b06fa09"} Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.370959 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"17038ed5cd74a89bf2dae1c407d75e4cec8489b3ff6d36b8cbb4da7c06d56e76"} Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.372773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2afba38745c50cb7a67480cd6cbcd4c5bd00baea8070e2fd4fbc8d3af17a8477"} Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.389637 4834 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.452427 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.453072 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.453287 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.453908 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.453930 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b2cdc"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.454621 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.456480 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.456645 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.456739 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nn8k8"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.457032 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.457481 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.458225 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.458243 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.458754 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.458822 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.458826 4834 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.458781 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.459620 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.460019 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.460171 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.461300 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.463952 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.464363 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.464813 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.465135 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.465327 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 
22:25:42.465823 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.466019 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.466084 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.466022 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.482874 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2vxck"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.483698 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.483941 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.484523 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.485023 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.489500 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.504903 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-client-ca\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.504948 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbqtl\" (UniqueName: \"kubernetes.io/projected/88947829-240b-4ebf-9125-c03a9e4fd9df-kube-api-access-vbqtl\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.504968 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-config\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.504987 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-serving-cert\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505003 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-config\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505036 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrftz\" (UniqueName: \"kubernetes.io/projected/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-kube-api-access-qrftz\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505051 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88947829-240b-4ebf-9125-c03a9e4fd9df-serving-cert\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505067 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298bl\" (UniqueName: \"kubernetes.io/projected/86ddb047-3ac1-4136-a4a2-37469c06fa91-kube-api-access-298bl\") pod 
\"cluster-samples-operator-665b6dd947-cdxtt\" (UID: \"86ddb047-3ac1-4136-a4a2-37469c06fa91\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-client-ca\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.505111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ddb047-3ac1-4136-a4a2-37469c06fa91-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cdxtt\" (UID: \"86ddb047-3ac1-4136-a4a2-37469c06fa91\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.506584 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.506762 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.506906 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.506901 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.507107 4834 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.508021 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.508174 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.509771 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqn48"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.510610 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4v5sm"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.511029 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.511414 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.512847 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.513250 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.513467 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.513569 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.513629 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.513905 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514163 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514204 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514259 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514271 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514324 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-66fwq"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514186 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.514872 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.515339 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.515533 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.516370 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x69p6"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.519736 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.520984 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.523439 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.524052 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.529518 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-86m4g"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.530528 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lbmrk"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.530729 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.530949 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.531007 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.531379 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.532352 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.532773 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533098 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533220 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533423 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533522 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533611 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533722 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.533735 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534072 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 
22:25:42.534116 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534279 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534328 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534415 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534446 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534530 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534546 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534630 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534724 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534811 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.534887 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 
22:25:42.534963 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.535042 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.535124 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.535272 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.535365 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.535812 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.536065 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.536196 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.536084 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.538178 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.538567 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.538672 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.538782 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.538891 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.539110 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.539239 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.539463 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.542450 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.543068 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nn8k8"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.543193 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.544276 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.544465 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.676163 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.676515 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b2cdc"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.678016 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686657 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-config\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686713 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrftz\" (UniqueName: \"kubernetes.io/projected/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-kube-api-access-qrftz\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686733 
4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88947829-240b-4ebf-9125-c03a9e4fd9df-serving-cert\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686750 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-298bl\" (UniqueName: \"kubernetes.io/projected/86ddb047-3ac1-4136-a4a2-37469c06fa91-kube-api-access-298bl\") pod \"cluster-samples-operator-665b6dd947-cdxtt\" (UID: \"86ddb047-3ac1-4136-a4a2-37469c06fa91\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686786 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-serving-cert\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3d207c-4b27-4342-964f-80f5039fa7f6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: 
\"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686847 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-client-ca\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686865 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ddb047-3ac1-4136-a4a2-37469c06fa91-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cdxtt\" (UID: \"86ddb047-3ac1-4136-a4a2-37469c06fa91\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686882 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-client-ca\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbqtl\" (UniqueName: \"kubernetes.io/projected/88947829-240b-4ebf-9125-c03a9e4fd9df-kube-api-access-vbqtl\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.686914 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-config\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.689057 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-client-ca\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.689722 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.689777 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-client-ca\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.690840 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-config\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.693294 4834 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.693980 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-serving-cert\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.696550 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.696735 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.696979 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/86ddb047-3ac1-4136-a4a2-37469c06fa91-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cdxtt\" (UID: \"86ddb047-3ac1-4136-a4a2-37469c06fa91\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697277 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697324 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697612 4834 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697731 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697730 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.697852 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.701383 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88947829-240b-4ebf-9125-c03a9e4fd9df-serving-cert\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.702022 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.702880 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.703793 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.708807 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.708880 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.712415 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.713259 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lbmrk"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.713301 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2vxck"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.713310 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4v5sm"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.713620 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.713755 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.718491 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.719235 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.721667 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.728058 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-66fwq"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.728825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-config\") pod \"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.729085 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.731491 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.731547 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.732036 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-298bl\" (UniqueName: \"kubernetes.io/projected/86ddb047-3ac1-4136-a4a2-37469c06fa91-kube-api-access-298bl\") pod \"cluster-samples-operator-665b6dd947-cdxtt\" (UID: \"86ddb047-3ac1-4136-a4a2-37469c06fa91\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.747907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrftz\" (UniqueName: \"kubernetes.io/projected/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-kube-api-access-qrftz\") pod \"route-controller-manager-6576b87f9c-mn7cl\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.749089 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbqtl\" (UniqueName: \"kubernetes.io/projected/88947829-240b-4ebf-9125-c03a9e4fd9df-kube-api-access-vbqtl\") pod 
\"controller-manager-879f6c89f-b2cdc\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.754647 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-86m4g"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.759206 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.759446 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqn48"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.761198 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x69p6"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.766626 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.773434 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rdz6s"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.774772 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.776536 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.777060 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.777369 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.778622 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n7fb"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.778771 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.778903 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.796291 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797261 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-trusted-ca\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797397 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b42878da-c62e-425b-b147-57836dcd9a2d-config\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797472 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq"] Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.797592 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfttn\" (UniqueName: \"kubernetes.io/projected/b42878da-c62e-425b-b147-57836dcd9a2d-kube-api-access-qfttn\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tzsd\" (UniqueName: \"kubernetes.io/projected/ec174061-c43b-4749-8684-500ce8aaea32-kube-api-access-7tzsd\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797806 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v98v7\" (UniqueName: \"kubernetes.io/projected/7f790722-e671-49a0-9d77-ead026007180-kube-api-access-v98v7\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797897 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-encryption-config\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.797988 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.798067 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.798169 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.798279 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.798381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936e07e4-586f-4092-ad38-4e49512485a5-serving-cert\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.799454 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gxfrf"] Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.801638 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.802826 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.803258 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.803282 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.803476 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.804682 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4vm\" (UniqueName: \"kubernetes.io/projected/936e07e4-586f-4092-ad38-4e49512485a5-kube-api-access-pc4vm\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.819286 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 
22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.805439 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.816326 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.819364 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-config\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.840708 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jgwz"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.841328 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.841992 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.842648 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.842779 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.842990 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.843256 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.845304 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.845655 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.845990 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.846508 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-etcd-client\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.846595 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3d207c-4b27-4342-964f-80f5039fa7f6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.846685 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/535dd049-2145-4ecc-8165-1cad8d1e1ef1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.846785 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-audit\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.846875 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-image-import-ca\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.847398 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.848307 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec174061-c43b-4749-8684-500ce8aaea32-serving-cert\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 
22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.848400 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/535dd049-2145-4ecc-8165-1cad8d1e1ef1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.848496 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f790722-e671-49a0-9d77-ead026007180-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.848643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-config\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.848945 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3d207c-4b27-4342-964f-80f5039fa7f6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.849568 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-serving-cert\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.852612 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.853809 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854054 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-j595w"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854257 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-policies\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854312 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz5m7\" (UniqueName: \"kubernetes.io/projected/80186c7b-b4a2-4480-a605-18eafe6067fb-kube-api-access-dz5m7\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854331 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-serving-cert\") pod \"console-operator-58897d9998-x69p6\" (UID: 
\"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854368 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bztg6\" (UniqueName: \"kubernetes.io/projected/535dd049-2145-4ecc-8165-1cad8d1e1ef1-kube-api-access-bztg6\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854397 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-oauth-serving-cert\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-config\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854439 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854457 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b42878da-c62e-425b-b147-57836dcd9a2d-images\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854473 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j8jv\" (UniqueName: \"kubernetes.io/projected/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-kube-api-access-2j8jv\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854487 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854512 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-machine-approver-tls\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854530 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5prd\" (UniqueName: \"kubernetes.io/projected/084bae12-5db3-49bc-b703-a694b692c215-kube-api-access-s5prd\") pod \"downloads-7954f5f757-86m4g\" (UID: 
\"084bae12-5db3-49bc-b703-a694b692c215\") " pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854546 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-dir\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854562 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854593 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80186c7b-b4a2-4480-a605-18eafe6067fb-audit-dir\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-service-ca\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854623 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lf9\" (UniqueName: \"kubernetes.io/projected/5496ae0a-5098-49eb-9a39-82e4d0c584bf-kube-api-access-h7lf9\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854640 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-audit-policies\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854724 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.854744 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77239a2f-ad60-4314-8d76-87449351907a-audit-dir\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854772 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-serving-cert\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854794 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-encryption-config\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/535dd049-2145-4ecc-8165-1cad8d1e1ef1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854837 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-serving-cert\") pod \"apiserver-7bbb656c7d-9v995\" (UID: 
\"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854840 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.854854 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk447\" (UniqueName: \"kubernetes.io/projected/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-kube-api-access-bk447\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855095 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-service-ca-bundle\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855124 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhfg2\" (UniqueName: \"kubernetes.io/projected/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-kube-api-access-xhfg2\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: 
\"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855250 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-trusted-ca-bundle\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855258 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855298 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-etcd-client\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855327 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxfls\" (UniqueName: \"kubernetes.io/projected/ea3d207c-4b27-4342-964f-80f5039fa7f6-kube-api-access-gxfls\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855355 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b42878da-c62e-425b-b147-57836dcd9a2d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-auth-proxy-config\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855408 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855491 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea3d207c-4b27-4342-964f-80f5039fa7f6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.855784 4834 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.856025 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.857002 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.857672 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4xwnj"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.858166 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.858958 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.859094 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sr6sl"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.859761 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.861427 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.862055 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xsxk5"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.862590 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.863187 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.863597 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.863954 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.865162 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.865769 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sfxxw"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.866034 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.866331 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.866883 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.866674 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-r2fr4"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.867789 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.868871 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869107 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87bwd\" (UniqueName: \"kubernetes.io/projected/77239a2f-ad60-4314-8d76-87449351907a-kube-api-access-87bwd\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869179 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869259 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f790722-e671-49a0-9d77-ead026007180-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77239a2f-ad60-4314-8d76-87449351907a-node-pullsecrets\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-config\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-oauth-config\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/936e07e4-586f-4092-ad38-4e49512485a5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869675 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869730 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-config\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869847 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.869964 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.871092 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.872309 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n7fb"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.873901 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.876006 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.876267 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-j64fd"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.876852 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.877307 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.879322 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rdz6s"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.880604 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.882525 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.884375 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.886265 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.888401 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.889986 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.892289 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sr6sl"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.894251 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-sfxxw"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.895693 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.897292 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.901529 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r2fr4"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.902112 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4xwnj"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.905983 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.921212 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jgwz"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.926784 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.930335 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gxfrf"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.933424 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.935652 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.938387 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.943711 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xsxk5"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.946123 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6"] Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.957473 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971366 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-plugins-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-csi-data-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971465 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971495 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77239a2f-ad60-4314-8d76-87449351907a-audit-dir\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971538 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-socket-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971564 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-client\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971586 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-serving-cert\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.971604 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-encryption-config\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971626 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/535dd049-2145-4ecc-8165-1cad8d1e1ef1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971648 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-serving-cert\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971666 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-mountpoint-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971686 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk447\" (UniqueName: \"kubernetes.io/projected/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-kube-api-access-bk447\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " 
pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971704 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-serving-cert\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.971784 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1feb4b4-f628-4b37-9cb9-0e01269f5825-config\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.972854 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77239a2f-ad60-4314-8d76-87449351907a-audit-dir\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.975947 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-service-ca-bundle\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.976208 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhfg2\" (UniqueName: 
\"kubernetes.io/projected/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-kube-api-access-xhfg2\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.976256 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvsrx\" (UniqueName: \"kubernetes.io/projected/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-kube-api-access-tvsrx\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.977102 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8cf764-332f-4e89-ad19-6dad90f92692-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l5mhx\" (UID: \"7e8cf764-332f-4e89-ad19-6dad90f92692\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.977178 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.977229 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " 
pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.977845 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-trusted-ca-bundle\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.978134 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.978396 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-service-ca-bundle\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.978907 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-etcd-client\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.978944 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxfls\" (UniqueName: \"kubernetes.io/projected/ea3d207c-4b27-4342-964f-80f5039fa7f6-kube-api-access-gxfls\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.978973 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b42878da-c62e-425b-b147-57836dcd9a2d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-auth-proxy-config\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979072 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a0fdcf-b763-4509-8604-17ed927f48a2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979100 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1feb4b4-f628-4b37-9cb9-0e01269f5825-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979128 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87bwd\" (UniqueName: \"kubernetes.io/projected/77239a2f-ad60-4314-8d76-87449351907a-kube-api-access-87bwd\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f790722-e671-49a0-9d77-ead026007180-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979268 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77239a2f-ad60-4314-8d76-87449351907a-node-pullsecrets\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979289 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-config\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979312 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-oauth-config\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979335 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/936e07e4-586f-4092-ad38-4e49512485a5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979363 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979398 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-config\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979435 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a0fdcf-b763-4509-8604-17ed927f48a2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979460 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-trusted-ca\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979485 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-registration-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979518 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-trusted-ca\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979541 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b42878da-c62e-425b-b147-57836dcd9a2d-config\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979562 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfttn\" (UniqueName: \"kubernetes.io/projected/b42878da-c62e-425b-b147-57836dcd9a2d-kube-api-access-qfttn\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979586 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tzsd\" (UniqueName: \"kubernetes.io/projected/ec174061-c43b-4749-8684-500ce8aaea32-kube-api-access-7tzsd\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979611 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v98v7\" (UniqueName: \"kubernetes.io/projected/7f790722-e671-49a0-9d77-ead026007180-kube-api-access-v98v7\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979636 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-encryption-config\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.979665 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979688 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979712 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979736 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6a0fdcf-b763-4509-8604-17ed927f48a2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979768 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/936e07e4-586f-4092-ad38-4e49512485a5-serving-cert\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979790 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4vm\" (UniqueName: \"kubernetes.io/projected/936e07e4-586f-4092-ad38-4e49512485a5-kube-api-access-pc4vm\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979816 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979861 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-config\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979889 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45473af-e174-4767-91d1-317e402f20a2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" 
Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-etcd-client\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979936 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htb29\" (UniqueName: \"kubernetes.io/projected/7c2cc088-ced0-4542-878b-48488976518a-kube-api-access-htb29\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979958 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979974 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-trusted-ca-bundle\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.979981 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3d207c-4b27-4342-964f-80f5039fa7f6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980006 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/535dd049-2145-4ecc-8165-1cad8d1e1ef1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980030 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-audit\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980057 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-image-import-ca\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980082 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980105 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b1feb4b4-f628-4b37-9cb9-0e01269f5825-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980129 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec174061-c43b-4749-8684-500ce8aaea32-serving-cert\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980173 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/535dd049-2145-4ecc-8165-1cad8d1e1ef1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980199 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f790722-e671-49a0-9d77-ead026007180-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980214 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " 
pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980228 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-config\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980255 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-metrics-tls\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980285 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-serving-cert\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980311 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7z5\" (UniqueName: \"kubernetes.io/projected/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-kube-api-access-9q7z5\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980334 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxzl\" (UniqueName: \"kubernetes.io/projected/7e8cf764-332f-4e89-ad19-6dad90f92692-kube-api-access-nqxzl\") pod 
\"package-server-manager-789f6589d5-l5mhx\" (UID: \"7e8cf764-332f-4e89-ad19-6dad90f92692\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980361 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-config\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980384 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-service-ca\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-policies\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980433 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz5m7\" (UniqueName: \"kubernetes.io/projected/80186c7b-b4a2-4480-a605-18eafe6067fb-kube-api-access-dz5m7\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980457 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-serving-cert\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980491 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bztg6\" (UniqueName: \"kubernetes.io/projected/535dd049-2145-4ecc-8165-1cad8d1e1ef1-kube-api-access-bztg6\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980511 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-oauth-serving-cert\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45473af-e174-4767-91d1-317e402f20a2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980555 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-ca\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:42 crc kubenswrapper[4834]: 
I1008 22:25:42.980583 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-config\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980610 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980641 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b42878da-c62e-425b-b147-57836dcd9a2d-images\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980663 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j8jv\" (UniqueName: \"kubernetes.io/projected/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-kube-api-access-2j8jv\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980690 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980712 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45473af-e174-4767-91d1-317e402f20a2-config\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980736 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-machine-approver-tls\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980768 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5prd\" (UniqueName: \"kubernetes.io/projected/084bae12-5db3-49bc-b703-a694b692c215-kube-api-access-s5prd\") pod \"downloads-7954f5f757-86m4g\" (UID: \"084bae12-5db3-49bc-b703-a694b692c215\") " pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980791 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-dir\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980834 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980857 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80186c7b-b4a2-4480-a605-18eafe6067fb-audit-dir\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980884 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-service-ca\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980905 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lf9\" (UniqueName: \"kubernetes.io/projected/5496ae0a-5098-49eb-9a39-82e4d0c584bf-kube-api-access-h7lf9\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.980929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-audit-policies\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.981699 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-audit-policies\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.981829 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/535dd049-2145-4ecc-8165-1cad8d1e1ef1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.981842 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.983271 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.983909 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-serving-cert\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.984757 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.986278 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-dir\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.986745 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-etcd-serving-ca\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.986957 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea3d207c-4b27-4342-964f-80f5039fa7f6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 
22:25:42.987415 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-config\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.987855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-audit\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.988011 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b42878da-c62e-425b-b147-57836dcd9a2d-config\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.988688 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.989080 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc 
kubenswrapper[4834]: I1008 22:25:42.989188 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-config\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.989445 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-encryption-config\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.989790 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-config\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.989803 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-etcd-client\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.989937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-config\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 
22:25:42.989977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-config\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.989978 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-auth-proxy-config\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.990095 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77239a2f-ad60-4314-8d76-87449351907a-node-pullsecrets\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.990850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-oauth-serving-cert\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.991814 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/80186c7b-b4a2-4480-a605-18eafe6067fb-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.992036 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77239a2f-ad60-4314-8d76-87449351907a-image-import-ca\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.992410 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f790722-e671-49a0-9d77-ead026007180-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.992632 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.992850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.992867 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: 
\"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.993119 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/936e07e4-586f-4092-ad38-4e49512485a5-serving-cert\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.993321 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-service-ca\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.993428 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec174061-c43b-4749-8684-500ce8aaea32-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.993630 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-policies\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.993774 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b42878da-c62e-425b-b147-57836dcd9a2d-images\") pod 
\"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.993988 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80186c7b-b4a2-4480-a605-18eafe6067fb-audit-dir\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.994217 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.994499 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/936e07e4-586f-4092-ad38-4e49512485a5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.994658 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-trusted-ca\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.995479 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-machine-approver-tls\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.995584 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b42878da-c62e-425b-b147-57836dcd9a2d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.995804 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.995907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.997518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec174061-c43b-4749-8684-500ce8aaea32-serving-cert\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.997557 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-serving-cert\") pod 
\"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.998950 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77239a2f-ad60-4314-8d76-87449351907a-etcd-client\") pod \"apiserver-76f77b778f-qqn48\" (UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.997442 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-serving-cert\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.999238 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/80186c7b-b4a2-4480-a605-18eafe6067fb-encryption-config\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:42 crc kubenswrapper[4834]: I1008 22:25:42.999331 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f790722-e671-49a0-9d77-ead026007180-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.000636 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.000763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-oauth-config\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.000851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/535dd049-2145-4ecc-8165-1cad8d1e1ef1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.004076 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.016115 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.069624 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-serving-cert\") pod 
\"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.073940 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.074089 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.076882 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htb29\" (UniqueName: \"kubernetes.io/projected/7c2cc088-ced0-4542-878b-48488976518a-kube-api-access-htb29\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081859 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081887 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1feb4b4-f628-4b37-9cb9-0e01269f5825-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081915 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-metrics-tls\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081943 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-config\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081962 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-service-ca\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.081982 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q7z5\" (UniqueName: \"kubernetes.io/projected/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-kube-api-access-9q7z5\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxzl\" (UniqueName: \"kubernetes.io/projected/7e8cf764-332f-4e89-ad19-6dad90f92692-kube-api-access-nqxzl\") pod 
\"package-server-manager-789f6589d5-l5mhx\" (UID: \"7e8cf764-332f-4e89-ad19-6dad90f92692\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45473af-e174-4767-91d1-317e402f20a2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082072 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-ca\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082107 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45473af-e174-4767-91d1-317e402f20a2-config\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082174 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-plugins-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082201 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-csi-data-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082228 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-socket-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082250 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-client\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082275 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-mountpoint-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082292 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-serving-cert\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082304 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-csi-data-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082314 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1feb4b4-f628-4b37-9cb9-0e01269f5825-config\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082421 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvsrx\" (UniqueName: \"kubernetes.io/projected/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-kube-api-access-tvsrx\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082447 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8cf764-332f-4e89-ad19-6dad90f92692-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l5mhx\" (UID: \"7e8cf764-332f-4e89-ad19-6dad90f92692\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082515 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a0fdcf-b763-4509-8604-17ed927f48a2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082519 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-plugins-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082537 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-socket-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1feb4b4-f628-4b37-9cb9-0e01269f5825-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a0fdcf-b763-4509-8604-17ed927f48a2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082722 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-registration-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082770 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-trusted-ca\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082852 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6a0fdcf-b763-4509-8604-17ed927f48a2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-config\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082922 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-service-ca\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082941 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45473af-e174-4767-91d1-317e402f20a2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.082983 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-mountpoint-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.083027 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c2cc088-ced0-4542-878b-48488976518a-registration-dir\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.083066 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-ca\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.084093 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-trusted-ca\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.085324 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6a0fdcf-b763-4509-8604-17ed927f48a2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.085948 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-etcd-client\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.086409 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-serving-cert\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.089072 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-metrics-tls\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.097783 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.103853 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6a0fdcf-b763-4509-8604-17ed927f48a2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: 
\"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.115242 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.136054 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.155832 4834 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.175188 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.185579 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45473af-e174-4767-91d1-317e402f20a2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.195128 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.202959 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b45473af-e174-4767-91d1-317e402f20a2-config\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:43 crc 
kubenswrapper[4834]: I1008 22:25:43.237101 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.244089 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl"] Oct 08 22:25:43 crc kubenswrapper[4834]: W1008 22:25:43.252510 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a567e9e_5bea_48c9_ad1e_5ed2332f0341.slice/crio-1f779fa66ee6aeebd219965c219eaee9ddc04f52c598506a6da59be493162c08 WatchSource:0}: Error finding container 1f779fa66ee6aeebd219965c219eaee9ddc04f52c598506a6da59be493162c08: Status 404 returned error can't find the container with id 1f779fa66ee6aeebd219965c219eaee9ddc04f52c598506a6da59be493162c08 Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.275060 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.297100 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.316405 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.333109 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt"] Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.334192 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b2cdc"] Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.335661 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: W1008 22:25:43.343970 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88947829_240b_4ebf_9125_c03a9e4fd9df.slice/crio-1225fc322df52ff9dd9be7bcb0dd1e27f8337b969472620c23b57dc17b01249e WatchSource:0}: Error finding container 1225fc322df52ff9dd9be7bcb0dd1e27f8337b969472620c23b57dc17b01249e: Status 404 returned error can't find the container with id 1225fc322df52ff9dd9be7bcb0dd1e27f8337b969472620c23b57dc17b01249e Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.351632 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7e8cf764-332f-4e89-ad19-6dad90f92692-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-l5mhx\" (UID: \"7e8cf764-332f-4e89-ad19-6dad90f92692\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.356013 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.375587 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.377688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" event={"ID":"88947829-240b-4ebf-9125-c03a9e4fd9df","Type":"ContainerStarted","Data":"1225fc322df52ff9dd9be7bcb0dd1e27f8337b969472620c23b57dc17b01249e"} Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.378752 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" event={"ID":"6a567e9e-5bea-48c9-ad1e-5ed2332f0341","Type":"ContainerStarted","Data":"1f779fa66ee6aeebd219965c219eaee9ddc04f52c598506a6da59be493162c08"} Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.380170 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"098fa84322c79081ec3cb6119ef4cab6f4b943293fc640a13e53c33f3b6c8cfa"} Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.380711 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.386865 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"11a9739acfc4fb83226141ef169c581743347b559d5f4752a8e85ac98074fb39"} Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.388335 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f45bc7ec37c7fa5938dddcfa98584b825a5965b0aadee9f7f21a0175eb59cc19"} Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.396012 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.415824 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.426619 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1feb4b4-f628-4b37-9cb9-0e01269f5825-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.435315 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.456227 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.474437 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.495328 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.515502 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.535745 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.556101 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.576929 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.595973 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.618788 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.636352 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.656327 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.681737 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.695786 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.715279 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.722908 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1feb4b4-f628-4b37-9cb9-0e01269f5825-config\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.736041 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.756083 4834 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.775359 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.794890 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.816409 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.835663 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.856716 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.873608 4834 request.go:700] Waited for 1.017574446s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.874850 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.895273 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.915928 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.935190 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.956065 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.975447 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 08 22:25:43 crc kubenswrapper[4834]: I1008 22:25:43.996089 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.016268 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.036010 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.055564 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.076188 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.096223 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.116342 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.135243 4834 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.156717 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.176125 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.196492 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.216533 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.236735 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.257274 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.276266 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.296409 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.325344 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.336957 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 
22:25:44.357093 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.376191 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.396030 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" event={"ID":"88947829-240b-4ebf-9125-c03a9e4fd9df","Type":"ContainerStarted","Data":"e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98"} Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.396057 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.396564 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.398334 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" event={"ID":"86ddb047-3ac1-4136-a4a2-37469c06fa91","Type":"ContainerStarted","Data":"bc3f7caf2789e165ebcd0c6c35614a8a21eb61945d61b7da4dae21c7f5869388"} Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.398452 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" event={"ID":"86ddb047-3ac1-4136-a4a2-37469c06fa91","Type":"ContainerStarted","Data":"e0b258280bad77c5ec2b2405de44ffac5333d5fa40ebec1a2e6ea9ebf9dc3b94"} Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.400198 4834 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-b2cdc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.400327 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.400721 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" event={"ID":"6a567e9e-5bea-48c9-ad1e-5ed2332f0341","Type":"ContainerStarted","Data":"e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9"} Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.416232 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.436477 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.456014 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.476465 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.496446 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.515901 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 
22:25:44.535892 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.556114 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.576785 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.596308 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.617749 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.635636 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.656221 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.676117 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.731406 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk447\" (UniqueName: \"kubernetes.io/projected/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-kube-api-access-bk447\") pod \"console-f9d7485db-lbmrk\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.736494 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xhfg2\" (UniqueName: \"kubernetes.io/projected/95df5fe3-a6c2-4af6-94af-5e0540baf2a6-kube-api-access-xhfg2\") pod \"machine-approver-56656f9798-4gjq8\" (UID: \"95df5fe3-a6c2-4af6-94af-5e0540baf2a6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.756805 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v98v7\" (UniqueName: \"kubernetes.io/projected/7f790722-e671-49a0-9d77-ead026007180-kube-api-access-v98v7\") pod \"openshift-apiserver-operator-796bbdcf4f-mdd58\" (UID: \"7f790722-e671-49a0-9d77-ead026007180\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.763859 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.778263 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/535dd049-2145-4ecc-8165-1cad8d1e1ef1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.795666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxfls\" (UniqueName: \"kubernetes.io/projected/ea3d207c-4b27-4342-964f-80f5039fa7f6-kube-api-access-gxfls\") pod \"openshift-controller-manager-operator-756b6f6bc6-nhm6q\" (UID: \"ea3d207c-4b27-4342-964f-80f5039fa7f6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.815557 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pc4vm\" (UniqueName: \"kubernetes.io/projected/936e07e4-586f-4092-ad38-4e49512485a5-kube-api-access-pc4vm\") pod \"openshift-config-operator-7777fb866f-2vxck\" (UID: \"936e07e4-586f-4092-ad38-4e49512485a5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.826856 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.834250 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bztg6\" (UniqueName: \"kubernetes.io/projected/535dd049-2145-4ecc-8165-1cad8d1e1ef1-kube-api-access-bztg6\") pod \"cluster-image-registry-operator-dc59b4c8b-d7fsf\" (UID: \"535dd049-2145-4ecc-8165-1cad8d1e1ef1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.861361 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfttn\" (UniqueName: \"kubernetes.io/projected/b42878da-c62e-425b-b147-57836dcd9a2d-kube-api-access-qfttn\") pod \"machine-api-operator-5694c8668f-66fwq\" (UID: \"b42878da-c62e-425b-b147-57836dcd9a2d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.874112 4834 request.go:700] Waited for 1.88380402s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.874501 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87bwd\" (UniqueName: \"kubernetes.io/projected/77239a2f-ad60-4314-8d76-87449351907a-kube-api-access-87bwd\") pod \"apiserver-76f77b778f-qqn48\" 
(UID: \"77239a2f-ad60-4314-8d76-87449351907a\") " pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.892084 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5prd\" (UniqueName: \"kubernetes.io/projected/084bae12-5db3-49bc-b703-a694b692c215-kube-api-access-s5prd\") pod \"downloads-7954f5f757-86m4g\" (UID: \"084bae12-5db3-49bc-b703-a694b692c215\") " pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.912868 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tzsd\" (UniqueName: \"kubernetes.io/projected/ec174061-c43b-4749-8684-500ce8aaea32-kube-api-access-7tzsd\") pod \"authentication-operator-69f744f599-nn8k8\" (UID: \"ec174061-c43b-4749-8684-500ce8aaea32\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.913035 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" Oct 08 22:25:44 crc kubenswrapper[4834]: W1008 22:25:44.937333 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95df5fe3_a6c2_4af6_94af_5e0540baf2a6.slice/crio-a079f45d4d707a7ed3372a827fcc5a00baf9ca8d75db78add28bcabcf0b922ea WatchSource:0}: Error finding container a079f45d4d707a7ed3372a827fcc5a00baf9ca8d75db78add28bcabcf0b922ea: Status 404 returned error can't find the container with id a079f45d4d707a7ed3372a827fcc5a00baf9ca8d75db78add28bcabcf0b922ea Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.938215 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz5m7\" (UniqueName: \"kubernetes.io/projected/80186c7b-b4a2-4480-a605-18eafe6067fb-kube-api-access-dz5m7\") pod \"apiserver-7bbb656c7d-9v995\" (UID: \"80186c7b-b4a2-4480-a605-18eafe6067fb\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.958020 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.962322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lf9\" (UniqueName: \"kubernetes.io/projected/5496ae0a-5098-49eb-9a39-82e4d0c584bf-kube-api-access-h7lf9\") pod \"oauth-openshift-558db77b4-4v5sm\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.975434 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.978190 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j8jv\" (UniqueName: \"kubernetes.io/projected/d832b5a7-1a3a-4da8-8524-6bdf31d4fd94-kube-api-access-2j8jv\") pod \"console-operator-58897d9998-x69p6\" (UID: \"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94\") " pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.985111 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58"] Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.991495 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:44 crc kubenswrapper[4834]: I1008 22:25:44.995868 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.006609 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" Oct 08 22:25:45 crc kubenswrapper[4834]: W1008 22:25:45.009893 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f790722_e671_49a0_9d77_ead026007180.slice/crio-e5d304d0779d2966f47982ea510247be197c8c0952e0a902f669be3999f09db8 WatchSource:0}: Error finding container e5d304d0779d2966f47982ea510247be197c8c0952e0a902f669be3999f09db8: Status 404 returned error can't find the container with id e5d304d0779d2966f47982ea510247be197c8c0952e0a902f669be3999f09db8 Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.010686 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htb29\" (UniqueName: \"kubernetes.io/projected/7c2cc088-ced0-4542-878b-48488976518a-kube-api-access-htb29\") pod \"csi-hostpathplugin-gxfrf\" (UID: \"7c2cc088-ced0-4542-878b-48488976518a\") " pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.028859 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.034355 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q7z5\" (UniqueName: \"kubernetes.io/projected/5e75cda3-14b7-4333-a8b8-e617fd72b6a1-kube-api-access-9q7z5\") pod \"etcd-operator-b45778765-rdz6s\" (UID: \"5e75cda3-14b7-4333-a8b8-e617fd72b6a1\") " pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.045623 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.050796 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.057783 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqxzl\" (UniqueName: \"kubernetes.io/projected/7e8cf764-332f-4e89-ad19-6dad90f92692-kube-api-access-nqxzl\") pod \"package-server-manager-789f6589d5-l5mhx\" (UID: \"7e8cf764-332f-4e89-ad19-6dad90f92692\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.068500 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.075164 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1feb4b4-f628-4b37-9cb9-0e01269f5825-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kdqjq\" (UID: \"b1feb4b4-f628-4b37-9cb9-0e01269f5825\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.094387 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.099457 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvsrx\" (UniqueName: \"kubernetes.io/projected/c9182dd7-0f3e-4916-b53e-0a9de9fffa89-kube-api-access-tvsrx\") pod \"ingress-operator-5b745b69d9-xh4x8\" (UID: \"c9182dd7-0f3e-4916-b53e-0a9de9fffa89\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.112669 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6a0fdcf-b763-4509-8604-17ed927f48a2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wk5x7\" (UID: \"f6a0fdcf-b763-4509-8604-17ed927f48a2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.113234 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.118457 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.130662 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b45473af-e174-4767-91d1-317e402f20a2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j8g6p\" (UID: \"b45473af-e174-4767-91d1-317e402f20a2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.168657 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.171344 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.176275 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.204904 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.209396 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214571 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2w6v\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-kube-api-access-p2w6v\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214626 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-bound-sa-token\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214703 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-certificates\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214731 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a975074f-5780-405c-bf73-36ebcaf7bb06-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214758 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-trusted-ca\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214786 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214882 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a975074f-5780-405c-bf73-36ebcaf7bb06-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.214909 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-tls\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.215386 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:45.715368201 +0000 UTC m=+153.538252947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.252413 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qqn48"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.316918 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.317094 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stscm\" (UniqueName: \"kubernetes.io/projected/6e9dedc4-5147-4980-8501-834fe5976c00-kube-api-access-stscm\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.317119 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6nkp\" (UniqueName: \"kubernetes.io/projected/933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e-kube-api-access-c6nkp\") pod \"dns-operator-744455d44c-4xwnj\" (UID: \"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e\") " pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.317216 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.317306 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-tls\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318426 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f5fbb5d9-094a-4948-a1f2-2f2b84bae26d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-m5dj6\" (UID: \"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318520 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pw26\" (UniqueName: \"kubernetes.io/projected/3c302d50-f83a-448d-a914-905ec04ada98-kube-api-access-6pw26\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318581 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d09cf73-0239-4398-b142-17fc3892aca9-serving-cert\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318636 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qms8w\" (UniqueName: \"kubernetes.io/projected/faa59b6b-6fdb-47fa-9b3e-92502adc3c70-kube-api-access-qms8w\") pod \"ingress-canary-sfxxw\" (UID: \"faa59b6b-6fdb-47fa-9b3e-92502adc3c70\") " pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34467af5-f30c-4d07-a6cb-9d300338f8a1-metrics-tls\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 
22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318747 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l45sj\" (UniqueName: \"kubernetes.io/projected/34467af5-f30c-4d07-a6cb-9d300338f8a1-kube-api-access-l45sj\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318770 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2dj\" (UniqueName: \"kubernetes.io/projected/fe82bfed-8295-42b8-bd74-1883fd35bab6-kube-api-access-jv2dj\") pod \"multus-admission-controller-857f4d67dd-sr6sl\" (UID: \"fe82bfed-8295-42b8-bd74-1883fd35bab6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318790 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2w6v\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-kube-api-access-p2w6v\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318834 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-bound-sa-token\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318878 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d09cf73-0239-4398-b142-17fc3892aca9-config\") pod \"service-ca-operator-777779d784-q7hzk\" 
(UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318933 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b754933-c10c-4ae3-9590-abca704681c6-images\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318957 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe82bfed-8295-42b8-bd74-1883fd35bab6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sr6sl\" (UID: \"fe82bfed-8295-42b8-bd74-1883fd35bab6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.318996 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b754933-c10c-4ae3-9590-abca704681c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d32e007d-c10b-46e2-8335-05743b165d1b-node-bootstrap-token\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319055 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtkg4\" (UniqueName: \"kubernetes.io/projected/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-kube-api-access-dtkg4\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319077 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa59b6b-6fdb-47fa-9b3e-92502adc3c70-cert\") pod \"ingress-canary-sfxxw\" (UID: \"faa59b6b-6fdb-47fa-9b3e-92502adc3c70\") " pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vbzv\" (UniqueName: \"kubernetes.io/projected/1b754933-c10c-4ae3-9590-abca704681c6-kube-api-access-5vbzv\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319214 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-service-ca-bundle\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319289 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18524e10-53d8-4201-9c5c-2564d88cfbfa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: 
\"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319331 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkczg\" (UniqueName: \"kubernetes.io/projected/f5fbb5d9-094a-4948-a1f2-2f2b84bae26d-kube-api-access-dkczg\") pod \"control-plane-machine-set-operator-78cbb6b69f-m5dj6\" (UID: \"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319400 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b754933-c10c-4ae3-9590-abca704681c6-proxy-tls\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f58858fd-7852-4081-892a-c1a815e56827-signing-cabundle\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319448 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-srv-cert\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319500 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-stats-auth\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319541 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18524e10-53d8-4201-9c5c-2564d88cfbfa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-certificates\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319618 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc74cae8-79d9-47fd-aaf4-0bf056569277-tmpfs\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319668 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn75b\" (UniqueName: \"kubernetes.io/projected/d32e007d-c10b-46e2-8335-05743b165d1b-kube-api-access-mn75b\") pod 
\"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319689 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc74cae8-79d9-47fd-aaf4-0bf056569277-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319725 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f58858fd-7852-4081-892a-c1a815e56827-signing-key\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319750 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e9dedc4-5147-4980-8501-834fe5976c00-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319771 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e-metrics-tls\") pod \"dns-operator-744455d44c-4xwnj\" (UID: \"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e\") " pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319793 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-srv-cert\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319892 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319916 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-default-certificate\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.319944 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc74cae8-79d9-47fd-aaf4-0bf056569277-webhook-cert\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.320010 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 22:25:45.819958783 +0000 UTC m=+153.642843529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320449 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34467af5-f30c-4d07-a6cb-9d300338f8a1-config-volume\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320484 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320511 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcs58\" (UniqueName: \"kubernetes.io/projected/bc74cae8-79d9-47fd-aaf4-0bf056569277-kube-api-access-vcs58\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320533 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d32e007d-c10b-46e2-8335-05743b165d1b-certs\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320606 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a975074f-5780-405c-bf73-36ebcaf7bb06-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320647 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pr2q\" (UniqueName: \"kubernetes.io/projected/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-kube-api-access-4pr2q\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320684 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgdx\" (UniqueName: \"kubernetes.io/projected/f58858fd-7852-4081-892a-c1a815e56827-kube-api-access-8dgdx\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320731 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74t8w\" (UniqueName: \"kubernetes.io/projected/1d09cf73-0239-4398-b142-17fc3892aca9-kube-api-access-74t8w\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320749 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbxl\" (UniqueName: \"kubernetes.io/projected/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-kube-api-access-qzbxl\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320770 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvbw\" (UniqueName: \"kubernetes.io/projected/c012967e-0c3a-4ce9-9c65-b0b84e3c0308-kube-api-access-gwvbw\") pod \"migrator-59844c95c7-9lpgw\" (UID: \"c012967e-0c3a-4ce9-9c65-b0b84e3c0308\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320859 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-trusted-ca\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320881 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e9dedc4-5147-4980-8501-834fe5976c00-proxy-tls\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320899 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtjld\" 
(UniqueName: \"kubernetes.io/projected/18524e10-53d8-4201-9c5c-2564d88cfbfa-kube-api-access-mtjld\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320931 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-metrics-certs\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.320953 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c302d50-f83a-448d-a914-905ec04ada98-config-volume\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.321210 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.321240 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a975074f-5780-405c-bf73-36ebcaf7bb06-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.321293 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-profile-collector-cert\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.321311 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lzrr\" (UniqueName: \"kubernetes.io/projected/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-kube-api-access-8lzrr\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.321327 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c302d50-f83a-448d-a914-905ec04ada98-secret-volume\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.326955 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-certificates\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.328431 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-trusted-ca\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.332610 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:45.832586857 +0000 UTC m=+153.655471603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.332834 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a975074f-5780-405c-bf73-36ebcaf7bb06-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.334599 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a975074f-5780-405c-bf73-36ebcaf7bb06-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.356095 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-tls\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.364429 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2w6v\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-kube-api-access-p2w6v\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: W1008 22:25:45.367207 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77239a2f_ad60_4314_8d76_87449351907a.slice/crio-ed51e70b243aea57a62a78dc3c4846ecd24ba11174757bc930d64059f42bb70b WatchSource:0}: Error finding container ed51e70b243aea57a62a78dc3c4846ecd24ba11174757bc930d64059f42bb70b: Status 404 returned error can't find the container with id ed51e70b243aea57a62a78dc3c4846ecd24ba11174757bc930d64059f42bb70b Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.371287 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.390335 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-bound-sa-token\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.407472 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.419011 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" event={"ID":"7f790722-e671-49a0-9d77-ead026007180","Type":"ContainerStarted","Data":"c36eaa70dcc242701d9ae68ca18eaef878c5b3ab6541ed501ba8181345427ed8"} Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.419083 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" event={"ID":"7f790722-e671-49a0-9d77-ead026007180","Type":"ContainerStarted","Data":"e5d304d0779d2966f47982ea510247be197c8c0952e0a902f669be3999f09db8"} Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.422407 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" event={"ID":"86ddb047-3ac1-4136-a4a2-37469c06fa91","Type":"ContainerStarted","Data":"0cdedafaf77a9ec8374dd2acc32468485b6cd95b680d1a930ddb3d0ef71fbbe8"} Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.424118 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:45.924095082 +0000 UTC m=+153.746979828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.424385 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" event={"ID":"95df5fe3-a6c2-4af6-94af-5e0540baf2a6","Type":"ContainerStarted","Data":"a079f45d4d707a7ed3372a827fcc5a00baf9ca8d75db78add28bcabcf0b922ea"} Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.424810 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.427874 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" event={"ID":"ea3d207c-4b27-4342-964f-80f5039fa7f6","Type":"ContainerStarted","Data":"ee1250749c8eef48b03b3c2e49c70a5ed77442ab5e572f9f03ad38a6c6cccec0"} Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.428387 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.428777 4834 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-b2cdc container/controller-manager namespace/openshift-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.428826 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f58858fd-7852-4081-892a-c1a815e56827-signing-key\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429365 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e9dedc4-5147-4980-8501-834fe5976c00-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429389 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e-metrics-tls\") pod \"dns-operator-744455d44c-4xwnj\" (UID: \"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e\") " pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429411 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-srv-cert\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429439 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429458 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429477 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-default-certificate\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429494 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc74cae8-79d9-47fd-aaf4-0bf056569277-webhook-cert\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429514 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34467af5-f30c-4d07-a6cb-9d300338f8a1-config-volume\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcs58\" (UniqueName: \"kubernetes.io/projected/bc74cae8-79d9-47fd-aaf4-0bf056569277-kube-api-access-vcs58\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429555 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d32e007d-c10b-46e2-8335-05743b165d1b-certs\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429577 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pr2q\" (UniqueName: \"kubernetes.io/projected/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-kube-api-access-4pr2q\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.429597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgdx\" (UniqueName: \"kubernetes.io/projected/f58858fd-7852-4081-892a-c1a815e56827-kube-api-access-8dgdx\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc 
kubenswrapper[4834]: I1008 22:25:45.429619 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvbw\" (UniqueName: \"kubernetes.io/projected/c012967e-0c3a-4ce9-9c65-b0b84e3c0308-kube-api-access-gwvbw\") pod \"migrator-59844c95c7-9lpgw\" (UID: \"c012967e-0c3a-4ce9-9c65-b0b84e3c0308\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.431015 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34467af5-f30c-4d07-a6cb-9d300338f8a1-config-volume\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.431231 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74t8w\" (UniqueName: \"kubernetes.io/projected/1d09cf73-0239-4398-b142-17fc3892aca9-kube-api-access-74t8w\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.431271 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbxl\" (UniqueName: \"kubernetes.io/projected/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-kube-api-access-qzbxl\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.431296 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtjld\" (UniqueName: \"kubernetes.io/projected/18524e10-53d8-4201-9c5c-2564d88cfbfa-kube-api-access-mtjld\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: 
\"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432360 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e9dedc4-5147-4980-8501-834fe5976c00-proxy-tls\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432397 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-metrics-certs\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432423 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c302d50-f83a-448d-a914-905ec04ada98-config-volume\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432454 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432474 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8lzrr\" (UniqueName: \"kubernetes.io/projected/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-kube-api-access-8lzrr\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432493 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-profile-collector-cert\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432511 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c302d50-f83a-448d-a914-905ec04ada98-secret-volume\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432527 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432546 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stscm\" (UniqueName: \"kubernetes.io/projected/6e9dedc4-5147-4980-8501-834fe5976c00-kube-api-access-stscm\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432562 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6nkp\" (UniqueName: \"kubernetes.io/projected/933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e-kube-api-access-c6nkp\") pod \"dns-operator-744455d44c-4xwnj\" (UID: \"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e\") " pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432594 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5fbb5d9-094a-4948-a1f2-2f2b84bae26d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-m5dj6\" (UID: \"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432637 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pw26\" (UniqueName: \"kubernetes.io/projected/3c302d50-f83a-448d-a914-905ec04ada98-kube-api-access-6pw26\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432648 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432658 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d09cf73-0239-4398-b142-17fc3892aca9-serving-cert\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432681 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qms8w\" (UniqueName: \"kubernetes.io/projected/faa59b6b-6fdb-47fa-9b3e-92502adc3c70-kube-api-access-qms8w\") pod \"ingress-canary-sfxxw\" (UID: \"faa59b6b-6fdb-47fa-9b3e-92502adc3c70\") " pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34467af5-f30c-4d07-a6cb-9d300338f8a1-metrics-tls\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432739 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l45sj\" (UniqueName: \"kubernetes.io/projected/34467af5-f30c-4d07-a6cb-9d300338f8a1-kube-api-access-l45sj\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432760 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv2dj\" (UniqueName: \"kubernetes.io/projected/fe82bfed-8295-42b8-bd74-1883fd35bab6-kube-api-access-jv2dj\") pod \"multus-admission-controller-857f4d67dd-sr6sl\" (UID: \"fe82bfed-8295-42b8-bd74-1883fd35bab6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432788 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d09cf73-0239-4398-b142-17fc3892aca9-config\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432812 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b754933-c10c-4ae3-9590-abca704681c6-images\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432829 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe82bfed-8295-42b8-bd74-1883fd35bab6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sr6sl\" (UID: \"fe82bfed-8295-42b8-bd74-1883fd35bab6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432853 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d32e007d-c10b-46e2-8335-05743b165d1b-node-bootstrap-token\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.432881 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b754933-c10c-4ae3-9590-abca704681c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 
crc kubenswrapper[4834]: I1008 22:25:45.432899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtkg4\" (UniqueName: \"kubernetes.io/projected/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-kube-api-access-dtkg4\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.433736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c302d50-f83a-448d-a914-905ec04ada98-config-volume\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.434423 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e9dedc4-5147-4980-8501-834fe5976c00-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.436796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa59b6b-6fdb-47fa-9b3e-92502adc3c70-cert\") pod \"ingress-canary-sfxxw\" (UID: \"faa59b6b-6fdb-47fa-9b3e-92502adc3c70\") " pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.436926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vbzv\" (UniqueName: \"kubernetes.io/projected/1b754933-c10c-4ae3-9590-abca704681c6-kube-api-access-5vbzv\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: 
\"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.436954 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-service-ca-bundle\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.436978 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18524e10-53d8-4201-9c5c-2564d88cfbfa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkczg\" (UniqueName: \"kubernetes.io/projected/f5fbb5d9-094a-4948-a1f2-2f2b84bae26d-kube-api-access-dkczg\") pod \"control-plane-machine-set-operator-78cbb6b69f-m5dj6\" (UID: \"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b754933-c10c-4ae3-9590-abca704681c6-proxy-tls\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437058 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f58858fd-7852-4081-892a-c1a815e56827-signing-cabundle\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437079 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-stats-auth\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-srv-cert\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437127 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18524e10-53d8-4201-9c5c-2564d88cfbfa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437170 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc74cae8-79d9-47fd-aaf4-0bf056569277-tmpfs\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc 
kubenswrapper[4834]: I1008 22:25:45.437198 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn75b\" (UniqueName: \"kubernetes.io/projected/d32e007d-c10b-46e2-8335-05743b165d1b-kube-api-access-mn75b\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.437219 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc74cae8-79d9-47fd-aaf4-0bf056569277-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.442134 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc74cae8-79d9-47fd-aaf4-0bf056569277-webhook-cert\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.442847 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c302d50-f83a-448d-a914-905ec04ada98-secret-volume\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.443299 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-default-certificate\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " 
pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.443312 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d32e007d-c10b-46e2-8335-05743b165d1b-certs\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.443763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.443806 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e9dedc4-5147-4980-8501-834fe5976c00-proxy-tls\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.443982 4834 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mn7cl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.444031 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" podUID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.447580 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e-metrics-tls\") pod \"dns-operator-744455d44c-4xwnj\" (UID: \"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e\") " pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.448814 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/faa59b6b-6fdb-47fa-9b3e-92502adc3c70-cert\") pod \"ingress-canary-sfxxw\" (UID: \"faa59b6b-6fdb-47fa-9b3e-92502adc3c70\") " pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.449177 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b754933-c10c-4ae3-9590-abca704681c6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.449523 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d09cf73-0239-4398-b142-17fc3892aca9-config\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.451840 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fe82bfed-8295-42b8-bd74-1883fd35bab6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sr6sl\" (UID: 
\"fe82bfed-8295-42b8-bd74-1883fd35bab6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.458841 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:45.958769685 +0000 UTC m=+153.781654431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.459536 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b754933-c10c-4ae3-9590-abca704681c6-images\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.459827 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-service-ca-bundle\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.460195 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.460375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18524e10-53d8-4201-9c5c-2564d88cfbfa-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.460404 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d09cf73-0239-4398-b142-17fc3892aca9-serving-cert\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.460884 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f5fbb5d9-094a-4948-a1f2-2f2b84bae26d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-m5dj6\" (UID: \"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.461586 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-metrics-certs\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 
22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.461894 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc74cae8-79d9-47fd-aaf4-0bf056569277-tmpfs\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.463232 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc74cae8-79d9-47fd-aaf4-0bf056569277-apiservice-cert\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.464930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d32e007d-c10b-46e2-8335-05743b165d1b-node-bootstrap-token\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.470677 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f58858fd-7852-4081-892a-c1a815e56827-signing-cabundle\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.470687 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-stats-auth\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 
crc kubenswrapper[4834]: I1008 22:25:45.473860 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-srv-cert\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.495329 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34467af5-f30c-4d07-a6cb-9d300338f8a1-metrics-tls\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.502804 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18524e10-53d8-4201-9c5c-2564d88cfbfa-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.503090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b754933-c10c-4ae3-9590-abca704681c6-proxy-tls\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.503394 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-profile-collector-cert\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.506971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgdx\" (UniqueName: \"kubernetes.io/projected/f58858fd-7852-4081-892a-c1a815e56827-kube-api-access-8dgdx\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.510870 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pr2q\" (UniqueName: \"kubernetes.io/projected/0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d-kube-api-access-4pr2q\") pod \"olm-operator-6b444d44fb-zhdx4\" (UID: \"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.517897 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-66fwq"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.518100 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f58858fd-7852-4081-892a-c1a815e56827-signing-key\") pod \"service-ca-9c57cc56f-5jgwz\" (UID: \"f58858fd-7852-4081-892a-c1a815e56827\") " pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.518432 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-srv-cert\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.518995 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vcs58\" (UniqueName: \"kubernetes.io/projected/bc74cae8-79d9-47fd-aaf4-0bf056569277-kube-api-access-vcs58\") pod \"packageserver-d55dfcdfc-m4dfc\" (UID: \"bc74cae8-79d9-47fd-aaf4-0bf056569277\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.524794 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.538453 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.541358 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.541469 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.041449892 +0000 UTC m=+153.864334638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.541994 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.545183 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.044994532 +0000 UTC m=+153.867879278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.556937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6nkp\" (UniqueName: \"kubernetes.io/projected/933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e-kube-api-access-c6nkp\") pod \"dns-operator-744455d44c-4xwnj\" (UID: \"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e\") " pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.580735 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvbw\" (UniqueName: \"kubernetes.io/projected/c012967e-0c3a-4ce9-9c65-b0b84e3c0308-kube-api-access-gwvbw\") pod \"migrator-59844c95c7-9lpgw\" (UID: \"c012967e-0c3a-4ce9-9c65-b0b84e3c0308\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.585550 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-86m4g"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.588090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lzrr\" (UniqueName: \"kubernetes.io/projected/6a1cdcf6-51bb-46e5-a3b7-80210921f1a6-kube-api-access-8lzrr\") pod \"catalog-operator-68c6474976-66msg\" (UID: \"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.588809 4834 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console-operator/console-operator-58897d9998-x69p6"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.604834 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stscm\" (UniqueName: \"kubernetes.io/projected/6e9dedc4-5147-4980-8501-834fe5976c00-kube-api-access-stscm\") pod \"machine-config-controller-84d6567774-f6zdn\" (UID: \"6e9dedc4-5147-4980-8501-834fe5976c00\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.607626 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.613549 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.623020 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74t8w\" (UniqueName: \"kubernetes.io/projected/1d09cf73-0239-4398-b142-17fc3892aca9-kube-api-access-74t8w\") pod \"service-ca-operator-777779d784-q7hzk\" (UID: \"1d09cf73-0239-4398-b142-17fc3892aca9\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.630057 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-rdz6s"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.640331 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv2dj\" (UniqueName: \"kubernetes.io/projected/fe82bfed-8295-42b8-bd74-1883fd35bab6-kube-api-access-jv2dj\") pod \"multus-admission-controller-857f4d67dd-sr6sl\" (UID: \"fe82bfed-8295-42b8-bd74-1883fd35bab6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc 
kubenswrapper[4834]: I1008 22:25:45.643625 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.644135 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.1441154 +0000 UTC m=+153.967000146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.649635 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.656344 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbxl\" (UniqueName: \"kubernetes.io/projected/e72cb57e-d32b-4f1c-9e1d-7fad47d553a9-kube-api-access-qzbxl\") pod \"router-default-5444994796-j595w\" (UID: \"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9\") " pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.683375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtjld\" (UniqueName: \"kubernetes.io/projected/18524e10-53d8-4201-9c5c-2564d88cfbfa-kube-api-access-mtjld\") pod \"kube-storage-version-migrator-operator-b67b599dd-txrxb\" (UID: \"18524e10-53d8-4201-9c5c-2564d88cfbfa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.687876 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.690940 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pw26\" (UniqueName: \"kubernetes.io/projected/3c302d50-f83a-448d-a914-905ec04ada98-kube-api-access-6pw26\") pod \"collect-profiles-29332695-bh7rb\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.690989 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lbmrk"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.711087 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nn8k8"] Oct 
08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.718055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtkg4\" (UniqueName: \"kubernetes.io/projected/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-kube-api-access-dtkg4\") pod \"marketplace-operator-79b997595-xsxk5\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.721993 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gxfrf"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.726950 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2vxck"] Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.731204 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vbzv\" (UniqueName: \"kubernetes.io/projected/1b754933-c10c-4ae3-9590-abca704681c6-kube-api-access-5vbzv\") pod \"machine-config-operator-74547568cd-7ktg9\" (UID: \"1b754933-c10c-4ae3-9590-abca704681c6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.745237 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.745752 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 22:25:46.245734959 +0000 UTC m=+154.068619705 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.754444 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qms8w\" (UniqueName: \"kubernetes.io/projected/faa59b6b-6fdb-47fa-9b3e-92502adc3c70-kube-api-access-qms8w\") pod \"ingress-canary-sfxxw\" (UID: \"faa59b6b-6fdb-47fa-9b3e-92502adc3c70\") " pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.768835 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkczg\" (UniqueName: \"kubernetes.io/projected/f5fbb5d9-094a-4948-a1f2-2f2b84bae26d-kube-api-access-dkczg\") pod \"control-plane-machine-set-operator-78cbb6b69f-m5dj6\" (UID: \"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.806252 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn75b\" (UniqueName: \"kubernetes.io/projected/d32e007d-c10b-46e2-8335-05743b165d1b-kube-api-access-mn75b\") pod \"machine-config-server-j64fd\" (UID: \"d32e007d-c10b-46e2-8335-05743b165d1b\") " pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.818674 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.840608 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l45sj\" (UniqueName: \"kubernetes.io/projected/34467af5-f30c-4d07-a6cb-9d300338f8a1-kube-api-access-l45sj\") pod \"dns-default-r2fr4\" (UID: \"34467af5-f30c-4d07-a6cb-9d300338f8a1\") " pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.841296 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.846763 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.846974 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.346942406 +0000 UTC m=+154.169827152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.847029 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.847035 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.847440 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.34741742 +0000 UTC m=+154.170302166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.855170 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:45 crc kubenswrapper[4834]: W1008 22:25:45.856451 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535dd049_2145_4ecc_8165_1cad8d1e1ef1.slice/crio-82742dd46b71ac1885d90aa19df354c1e796fdf461ab83f277742e368ac400cd WatchSource:0}: Error finding container 82742dd46b71ac1885d90aa19df354c1e796fdf461ab83f277742e368ac400cd: Status 404 returned error can't find the container with id 82742dd46b71ac1885d90aa19df354c1e796fdf461ab83f277742e368ac400cd Oct 08 22:25:45 crc kubenswrapper[4834]: W1008 22:25:45.863347 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e75cda3_14b7_4333_a8b8_e617fd72b6a1.slice/crio-d799a07bd0cfb05d06aeb418940717ac89497829bae322c5f5974e8db848e165 WatchSource:0}: Error finding container d799a07bd0cfb05d06aeb418940717ac89497829bae322c5f5974e8db848e165: Status 404 returned error can't find the container with id d799a07bd0cfb05d06aeb418940717ac89497829bae322c5f5974e8db848e165 Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.864067 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.869581 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.888104 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.891894 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.899626 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.902720 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.919502 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sfxxw" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.932849 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.935205 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.948177 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:45 crc kubenswrapper[4834]: E1008 22:25:45.948534 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.448488793 +0000 UTC m=+154.271373549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:45 crc kubenswrapper[4834]: W1008 22:25:45.952171 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode72cb57e_d32b_4f1c_9e1d_7fad47d553a9.slice/crio-774b89c378298c78e14aabb7d8953d7940953ae2200c7740b177173042998bcd WatchSource:0}: Error finding container 774b89c378298c78e14aabb7d8953d7940953ae2200c7740b177173042998bcd: Status 404 returned error can't find the container with id 774b89c378298c78e14aabb7d8953d7940953ae2200c7740b177173042998bcd Oct 08 22:25:45 crc kubenswrapper[4834]: I1008 22:25:45.957961 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-j64fd" Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.052192 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.052625 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.552608081 +0000 UTC m=+154.375492827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.070915 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.092216 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.092217 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cdxtt" 
podStartSLOduration=127.092192352 podStartE2EDuration="2m7.092192352s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:46.092108799 +0000 UTC m=+153.914993545" watchObservedRunningTime="2025-10-08 22:25:46.092192352 +0000 UTC m=+153.915077098" Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.115209 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4v5sm"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.127993 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.151507 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.156010 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.156674 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.656627758 +0000 UTC m=+154.479512504 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.167749 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.262039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.262922 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.762900726 +0000 UTC m=+154.585785472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.329350 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sr6sl"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.346355 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4xwnj"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.347692 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.363200 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.363314 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.863293291 +0000 UTC m=+154.686178037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.363640 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.363960 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.86395315 +0000 UTC m=+154.686837896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.381950 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sfxxw"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.389945 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" podStartSLOduration=127.389925448 podStartE2EDuration="2m7.389925448s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:46.38574329 +0000 UTC m=+154.208628036" watchObservedRunningTime="2025-10-08 22:25:46.389925448 +0000 UTC m=+154.212810184" Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.423659 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.443531 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" event={"ID":"535dd049-2145-4ecc-8165-1cad8d1e1ef1","Type":"ContainerStarted","Data":"82742dd46b71ac1885d90aa19df354c1e796fdf461ab83f277742e368ac400cd"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.444529 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-86m4g" 
event={"ID":"084bae12-5db3-49bc-b703-a694b692c215","Type":"ContainerStarted","Data":"241f607b1ba7238ce374047ba5f7a0d720176714e3327445771551fad1bde33f"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.446286 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lbmrk" event={"ID":"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19","Type":"ContainerStarted","Data":"4e740e33c7a64d475a0325275c391b4a2d90c7eb9e572b001e2b08face1a1518"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.457564 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.463076 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" event={"ID":"95df5fe3-a6c2-4af6-94af-5e0540baf2a6","Type":"ContainerStarted","Data":"8541185833c92a2538897b2259b80d718678e4e0790275edde61e4212ae5ca0f"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.466726 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.467057 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:46.967038379 +0000 UTC m=+154.789923125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.470631 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" event={"ID":"b42878da-c62e-425b-b147-57836dcd9a2d","Type":"ContainerStarted","Data":"fb003bad25e3c5003be5a19b9dcdbe63c7663c2dc1bd362d7027823da72f5c16"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.470699 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" event={"ID":"b42878da-c62e-425b-b147-57836dcd9a2d","Type":"ContainerStarted","Data":"19c5e36ba29a604dba5f1766f32602a4de0f63265ab5914dec340285a0bc78a2"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.471931 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.473395 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" event={"ID":"7e8cf764-332f-4e89-ad19-6dad90f92692","Type":"ContainerStarted","Data":"82bfeb6fbecd176ee64cbecc635cd3f4b05fe35b6ad60183d069ebad8a750426"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.475817 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" event={"ID":"7c2cc088-ced0-4542-878b-48488976518a","Type":"ContainerStarted","Data":"8aa8aa55a52c8a6d0fafb762ef0895dc6b3b08efcef1d92511ebe1b7d4fae871"} Oct 08 
22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.483038 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5jgwz"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.486184 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-x69p6" event={"ID":"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94","Type":"ContainerStarted","Data":"d69e002eefcb934df32e6fbe509ddce4c13ad7e1325981932aba6bfcc961ea54"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.489569 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" event={"ID":"ea3d207c-4b27-4342-964f-80f5039fa7f6","Type":"ContainerStarted","Data":"fcf1c50d03bcfe98ebfeb95846ebddcaa7b0e4f0653d820737eebbcb62ee1fb2"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.490497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" event={"ID":"5e75cda3-14b7-4333-a8b8-e617fd72b6a1","Type":"ContainerStarted","Data":"d799a07bd0cfb05d06aeb418940717ac89497829bae322c5f5974e8db848e165"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.491754 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" event={"ID":"77239a2f-ad60-4314-8d76-87449351907a","Type":"ContainerStarted","Data":"ed51e70b243aea57a62a78dc3c4846ecd24ba11174757bc930d64059f42bb70b"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.494794 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-j595w" event={"ID":"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9","Type":"ContainerStarted","Data":"774b89c378298c78e14aabb7d8953d7940953ae2200c7740b177173042998bcd"} Oct 08 22:25:46 crc kubenswrapper[4834]: W1008 22:25:46.494985 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a0fdcf_b763_4509_8604_17ed927f48a2.slice/crio-528443356e2e778bc10921e3681b87de3c62a94ebb5a35ea5734e1998755a50f WatchSource:0}: Error finding container 528443356e2e778bc10921e3681b87de3c62a94ebb5a35ea5734e1998755a50f: Status 404 returned error can't find the container with id 528443356e2e778bc10921e3681b87de3c62a94ebb5a35ea5734e1998755a50f Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.499574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" event={"ID":"c9182dd7-0f3e-4916-b53e-0a9de9fffa89","Type":"ContainerStarted","Data":"edd46dca9f6fe07f8d137d75cb7f22fd5531d7252f09475e76bb7e8cb7fbb307"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.507497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" event={"ID":"936e07e4-586f-4092-ad38-4e49512485a5","Type":"ContainerStarted","Data":"4735b480d0ec03f327bc31ca09fb2db92a78eca1fd4db4cb69945d69e2836997"} Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.510409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" event={"ID":"ec174061-c43b-4749-8684-500ce8aaea32","Type":"ContainerStarted","Data":"8e20da096c3da09b9f3a28d2f5458a83b6cf8cf385afa500ca125f5ba25f8a37"} Oct 08 22:25:46 crc kubenswrapper[4834]: W1008 22:25:46.521284 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe82bfed_8295_42b8_bd74_1883fd35bab6.slice/crio-c85bc65abaaa8cec95c92750054314d88690f2db11761d2ff12d4ac0146eab57 WatchSource:0}: Error finding container c85bc65abaaa8cec95c92750054314d88690f2db11761d2ff12d4ac0146eab57: Status 404 returned error can't find the container with id c85bc65abaaa8cec95c92750054314d88690f2db11761d2ff12d4ac0146eab57 Oct 08 22:25:46 
crc kubenswrapper[4834]: W1008 22:25:46.532135 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e8c92dc_69ee_4e97_a5a3_d1c1bd7d4a7d.slice/crio-b552e9a920bda798c2ced068c1e57990b7b5a47bd0dd96840cbdea7185f09b89 WatchSource:0}: Error finding container b552e9a920bda798c2ced068c1e57990b7b5a47bd0dd96840cbdea7185f09b89: Status 404 returned error can't find the container with id b552e9a920bda798c2ced068c1e57990b7b5a47bd0dd96840cbdea7185f09b89 Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.569181 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.569547 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.069533942 +0000 UTC m=+154.892418688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: W1008 22:25:46.570292 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58858fd_7852_4081_892a_c1a815e56827.slice/crio-deb4d350f6f0d72180970eef2e225ad7e9aaab220bf99454f1aa9c720d3dea83 WatchSource:0}: Error finding container deb4d350f6f0d72180970eef2e225ad7e9aaab220bf99454f1aa9c720d3dea83: Status 404 returned error can't find the container with id deb4d350f6f0d72180970eef2e225ad7e9aaab220bf99454f1aa9c720d3dea83 Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.574679 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.673200 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.674520 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.174488685 +0000 UTC m=+154.997373431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: W1008 22:25:46.688852 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd32e007d_c10b_46e2_8335_05743b165d1b.slice/crio-f2f9b2c3b5130d4c595ac069178b04187b30aeea06b5e9aae31209e8435f449b WatchSource:0}: Error finding container f2f9b2c3b5130d4c595ac069178b04187b30aeea06b5e9aae31209e8435f449b: Status 404 returned error can't find the container with id f2f9b2c3b5130d4c595ac069178b04187b30aeea06b5e9aae31209e8435f449b Oct 08 22:25:46 crc kubenswrapper[4834]: W1008 22:25:46.711176 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5fbb5d9_094a_4948_a1f2_2f2b84bae26d.slice/crio-43daa0f515dc2ae77097a23ecd32c414097a6955d85c667b27c42ea1fc6d3c6e WatchSource:0}: Error finding container 43daa0f515dc2ae77097a23ecd32c414097a6955d85c667b27c42ea1fc6d3c6e: Status 404 returned error can't find the container with id 43daa0f515dc2ae77097a23ecd32c414097a6955d85c667b27c42ea1fc6d3c6e Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.776187 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:46 crc 
kubenswrapper[4834]: E1008 22:25:46.777635 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.277614276 +0000 UTC m=+155.100499022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.880367 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.880825 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.380809089 +0000 UTC m=+155.203693825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.890200 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.969114 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn"] Oct 08 22:25:46 crc kubenswrapper[4834]: I1008 22:25:46.984775 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:46 crc kubenswrapper[4834]: E1008 22:25:46.985193 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.485174544 +0000 UTC m=+155.308059290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.014559 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.027760 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.027842 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.042279 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xsxk5"] Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.087540 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.088038 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.588016138 +0000 UTC m=+155.410900884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.117920 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" podStartSLOduration=128.117889655 podStartE2EDuration="2m8.117889655s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:47.072668827 +0000 UTC m=+154.895553573" watchObservedRunningTime="2025-10-08 22:25:47.117889655 +0000 UTC m=+154.940774401" Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.159541 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r2fr4"] Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.168766 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb"] Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.175387 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw"] Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.189875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.190414 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.690393638 +0000 UTC m=+155.513278394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:47 crc kubenswrapper[4834]: W1008 22:25:47.198830 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e9dedc4_5147_4980_8501_834fe5976c00.slice/crio-51e1337bf2068bae3a9f2cd5570205d687a0298570c4606b3fb05e153559d45e WatchSource:0}: Error finding container 51e1337bf2068bae3a9f2cd5570205d687a0298570c4606b3fb05e153559d45e: Status 404 returned error can't find the container with id 51e1337bf2068bae3a9f2cd5570205d687a0298570c4606b3fb05e153559d45e Oct 08 22:25:47 crc kubenswrapper[4834]: W1008 22:25:47.227832 4834 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34467af5_f30c_4d07_a6cb_9d300338f8a1.slice/crio-3129790fc6f8a943b24c0d75cd274b5db6631c45cac715740bd3476e8409ebc6 WatchSource:0}: Error finding container 3129790fc6f8a943b24c0d75cd274b5db6631c45cac715740bd3476e8409ebc6: Status 404 returned error can't find the container with id 3129790fc6f8a943b24c0d75cd274b5db6631c45cac715740bd3476e8409ebc6
Oct 08 22:25:47 crc kubenswrapper[4834]: W1008 22:25:47.244043 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18524e10_53d8_4201_9c5c_2564d88cfbfa.slice/crio-14da1e330b34d7d3ee2b2628ec2b7c9872ad800e15199908784962fc1db9040a WatchSource:0}: Error finding container 14da1e330b34d7d3ee2b2628ec2b7c9872ad800e15199908784962fc1db9040a: Status 404 returned error can't find the container with id 14da1e330b34d7d3ee2b2628ec2b7c9872ad800e15199908784962fc1db9040a
Oct 08 22:25:47 crc kubenswrapper[4834]: W1008 22:25:47.244941 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc012967e_0c3a_4ce9_9c65_b0b84e3c0308.slice/crio-6ef5bbc08017b54ee2e791efa1fea753823b4dee9e5b29feb42fdaf0b9f66471 WatchSource:0}: Error finding container 6ef5bbc08017b54ee2e791efa1fea753823b4dee9e5b29feb42fdaf0b9f66471: Status 404 returned error can't find the container with id 6ef5bbc08017b54ee2e791efa1fea753823b4dee9e5b29feb42fdaf0b9f66471
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.291903 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.292100 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.792044626 +0000 UTC m=+155.614929382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.292322 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.292988 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.792967163 +0000 UTC m=+155.615851909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.326503 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9"]
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.391109 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mdd58" podStartSLOduration=128.391087873 podStartE2EDuration="2m8.391087873s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:47.387854103 +0000 UTC m=+155.210738849" watchObservedRunningTime="2025-10-08 22:25:47.391087873 +0000 UTC m=+155.213972619"
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.393932 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.394401 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.894379816 +0000 UTC m=+155.717264562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.494864 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.495310 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:47.995290245 +0000 UTC m=+155.818174991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.529567 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" event={"ID":"bc74cae8-79d9-47fd-aaf4-0bf056569277","Type":"ContainerStarted","Data":"18c617013dd174d4344ac8dae25079efa8d3fc96ad19a834203d4162ba97f9dd"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.533593 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" event={"ID":"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d","Type":"ContainerStarted","Data":"b552e9a920bda798c2ced068c1e57990b7b5a47bd0dd96840cbdea7185f09b89"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.534949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" event={"ID":"6e9dedc4-5147-4980-8501-834fe5976c00","Type":"ContainerStarted","Data":"51e1337bf2068bae3a9f2cd5570205d687a0298570c4606b3fb05e153559d45e"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.543998 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2fr4" event={"ID":"34467af5-f30c-4d07-a6cb-9d300338f8a1","Type":"ContainerStarted","Data":"3129790fc6f8a943b24c0d75cd274b5db6631c45cac715740bd3476e8409ebc6"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.546619 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" event={"ID":"52cf1dbe-b7f9-44be-bc44-1308a5eb0471","Type":"ContainerStarted","Data":"811698e2b7c0e80996357578420c2866beeccdf11593e2fdc798ee40d6756a7d"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.547929 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" event={"ID":"1b754933-c10c-4ae3-9590-abca704681c6","Type":"ContainerStarted","Data":"533609a84cf6f783d953932228814ef5038e8dd60e5ebd5b8ca758c8b94b9542"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.549499 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" event={"ID":"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6","Type":"ContainerStarted","Data":"0586f4d79cddd3ac2a87993fb02e27879a41cdc38d482276d404c0d636ae7551"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.550924 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" event={"ID":"f58858fd-7852-4081-892a-c1a815e56827","Type":"ContainerStarted","Data":"deb4d350f6f0d72180970eef2e225ad7e9aaab220bf99454f1aa9c720d3dea83"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.552295 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" event={"ID":"f6a0fdcf-b763-4509-8604-17ed927f48a2","Type":"ContainerStarted","Data":"528443356e2e778bc10921e3681b87de3c62a94ebb5a35ea5734e1998755a50f"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.554019 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" event={"ID":"80186c7b-b4a2-4480-a605-18eafe6067fb","Type":"ContainerStarted","Data":"48606480ec7e3c21a79006cea6933fd14c01483800ed3864b79bfa1113594ef8"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.557399 4834 generic.go:334] "Generic (PLEG): container finished" podID="77239a2f-ad60-4314-8d76-87449351907a" containerID="987a119d73ac7bb8b80ac737f2b0c0e31890d73462be1565526b25ad1d07bee8" exitCode=0
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.563996 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" event={"ID":"c012967e-0c3a-4ce9-9c65-b0b84e3c0308","Type":"ContainerStarted","Data":"6ef5bbc08017b54ee2e791efa1fea753823b4dee9e5b29feb42fdaf0b9f66471"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.564035 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" event={"ID":"77239a2f-ad60-4314-8d76-87449351907a","Type":"ContainerDied","Data":"987a119d73ac7bb8b80ac737f2b0c0e31890d73462be1565526b25ad1d07bee8"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.564611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" event={"ID":"3c302d50-f83a-448d-a914-905ec04ada98","Type":"ContainerStarted","Data":"ec42d2c109c1c60c0ebe472037628bb6b223cd8f16bb23ed3cbcd04ef097dad7"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.566909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" event={"ID":"1d09cf73-0239-4398-b142-17fc3892aca9","Type":"ContainerStarted","Data":"4e0ba8ce98d5407005650b63c9c7b0de6849b835dc6517b24baa5163085b389c"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.568438 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" event={"ID":"7e8cf764-332f-4e89-ad19-6dad90f92692","Type":"ContainerStarted","Data":"e7d03062ce2516d105eb9bde507ec9de92edcdd7dc5a579d2945c04c3861b1bd"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.569511 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-j64fd" event={"ID":"d32e007d-c10b-46e2-8335-05743b165d1b","Type":"ContainerStarted","Data":"f2f9b2c3b5130d4c595ac069178b04187b30aeea06b5e9aae31209e8435f449b"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.570949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" event={"ID":"5496ae0a-5098-49eb-9a39-82e4d0c584bf","Type":"ContainerStarted","Data":"97f4d510d7a868e9a6aaecce0aab4f2d0911cda716a7ae4c013a4fd38a349287"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.571881 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" event={"ID":"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d","Type":"ContainerStarted","Data":"43daa0f515dc2ae77097a23ecd32c414097a6955d85c667b27c42ea1fc6d3c6e"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.572938 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" event={"ID":"b1feb4b4-f628-4b37-9cb9-0e01269f5825","Type":"ContainerStarted","Data":"4bb09ead44ff25003cb218c9b44dcd1e12c52b52fc5fc0cbf8b5b905381e3c34"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.575271 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" event={"ID":"fe82bfed-8295-42b8-bd74-1883fd35bab6","Type":"ContainerStarted","Data":"c85bc65abaaa8cec95c92750054314d88690f2db11761d2ff12d4ac0146eab57"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.576880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-86m4g" event={"ID":"084bae12-5db3-49bc-b703-a694b692c215","Type":"ContainerStarted","Data":"cf0501234783ba0bf8cb4f68dfe1c24c8765335b6f5c030f49a461f3aea2dfca"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.578047 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" event={"ID":"b45473af-e174-4767-91d1-317e402f20a2","Type":"ContainerStarted","Data":"175ef3e04682161a62690c615e0c2091d6b988e688570f9d870e764ceb3c722f"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.579684 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-x69p6" event={"ID":"d832b5a7-1a3a-4da8-8524-6bdf31d4fd94","Type":"ContainerStarted","Data":"4f1b3e41dba058781d39c6bd870b86812da77f13d78061cba46482ad929f0ba0"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.581355 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" event={"ID":"535dd049-2145-4ecc-8165-1cad8d1e1ef1","Type":"ContainerStarted","Data":"e86b25513f083001f0fef93189604d0f0685c4291db48924381bf2c4a07f8b36"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.595713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" event={"ID":"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e","Type":"ContainerStarted","Data":"9b2ccbb9c35a80933fb2f436e62fd0550f2fe8d351fb6acf4f411374d341cc51"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.598282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.598433 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.098396785 +0000 UTC m=+155.921281541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.598897 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sfxxw" event={"ID":"faa59b6b-6fdb-47fa-9b3e-92502adc3c70","Type":"ContainerStarted","Data":"e505131536e863665bf96a1cbc53e01a96208cadf18574dc0a49a81a121501c1"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.599473 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.599886 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.099866876 +0000 UTC m=+155.922751622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.600248 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" event={"ID":"18524e10-53d8-4201-9c5c-2564d88cfbfa","Type":"ContainerStarted","Data":"14da1e330b34d7d3ee2b2628ec2b7c9872ad800e15199908784962fc1db9040a"}
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.622993 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nhm6q" podStartSLOduration=128.622952904 podStartE2EDuration="2m8.622952904s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:47.615827424 +0000 UTC m=+155.438712170" watchObservedRunningTime="2025-10-08 22:25:47.622952904 +0000 UTC m=+155.445837660"
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.708070 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.709555 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.20953557 +0000 UTC m=+156.032420316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.809589 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.810095 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.310070518 +0000 UTC m=+156.132955254 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.911713 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.911950 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.411912544 +0000 UTC m=+156.234797300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:47 crc kubenswrapper[4834]: I1008 22:25:47.912018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:47 crc kubenswrapper[4834]: E1008 22:25:47.912643 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.412624754 +0000 UTC m=+156.235509520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.013453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.014490 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.514447448 +0000 UTC m=+156.337332234 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.015873 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.016550 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.516528447 +0000 UTC m=+156.339413223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.118234 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.118660 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.618642649 +0000 UTC m=+156.441527395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.222699 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.223682 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.723659383 +0000 UTC m=+156.546544129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.324074 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.324296 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.824263034 +0000 UTC m=+156.647147780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.324547 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.325193 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.825183529 +0000 UTC m=+156.648068275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.428937 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.429337 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:48.929315599 +0000 UTC m=+156.752200345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.531410 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.532170 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.032156331 +0000 UTC m=+156.855041077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.612069 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" event={"ID":"f5fbb5d9-094a-4948-a1f2-2f2b84bae26d","Type":"ContainerStarted","Data":"af4a3e554471faf0e8df7c54059be51529b7e18a95a97e220df1d05c8352f7dc"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.633439 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.633822 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.133802921 +0000 UTC m=+156.956687667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.641355 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" event={"ID":"fe82bfed-8295-42b8-bd74-1883fd35bab6","Type":"ContainerStarted","Data":"7dfa600970ce17a4c6b06a5516149425c480cc5f15c7750d89e699aee645ae00"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.645977 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-j64fd" event={"ID":"d32e007d-c10b-46e2-8335-05743b165d1b","Type":"ContainerStarted","Data":"672153957ca254fb68a1821419b0680774f37fc68864227fb3fe80351ddb0b83"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.649174 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" event={"ID":"52cf1dbe-b7f9-44be-bc44-1308a5eb0471","Type":"ContainerStarted","Data":"7d5966b68c73f70620059e69b283bf0edd5c22f4ef23211cb6e273c130a674da"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.651575 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" event={"ID":"f58858fd-7852-4081-892a-c1a815e56827","Type":"ContainerStarted","Data":"c427dffecc4ea8f6b424d73535c8ca34fc83ec1ea68bf15a473b605e32b7cb48"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.654026 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" event={"ID":"bc74cae8-79d9-47fd-aaf4-0bf056569277","Type":"ContainerStarted","Data":"ab4eed9d0272275cf6c54cd7b8c99785a3f5b777ceca8e2d1a3e0b3d2a247aa9"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.654309 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc"
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.656401 4834 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m4dfc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body=
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.656474 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" podUID="bc74cae8-79d9-47fd-aaf4-0bf056569277" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused"
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.663964 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" event={"ID":"0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d","Type":"ContainerStarted","Data":"cfcb389d9e770c07ea1c93a523aceb0d984baec9929df6d6b054a1e702b70ed0"}
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.664212 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4"
Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.666553 4834 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zhdx4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection
refused" start-of-body= Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.666604 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" podUID="0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.667100 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5jgwz" podStartSLOduration=129.667086624 podStartE2EDuration="2m9.667086624s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.666879768 +0000 UTC m=+156.489764514" watchObservedRunningTime="2025-10-08 22:25:48.667086624 +0000 UTC m=+156.489971360" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.673659 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2fr4" event={"ID":"34467af5-f30c-4d07-a6cb-9d300338f8a1","Type":"ContainerStarted","Data":"6733e8bcf07ad2e3bfe5257f8a5066f159e108e54cead012d0fef086d4348ae5"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.684575 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" event={"ID":"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e","Type":"ContainerStarted","Data":"07c9b02b00d3d243b7a38dee38499282d04f9a518041bde3e1199af768760c3b"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.694348 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-j595w" event={"ID":"e72cb57e-d32b-4f1c-9e1d-7fad47d553a9","Type":"ContainerStarted","Data":"c0f6b9daab0db62509c6d1ec2e8b7bfd46b0cea950f0980f40322e1cb8afa662"} Oct 08 22:25:48 crc 
kubenswrapper[4834]: I1008 22:25:48.701749 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" event={"ID":"5e75cda3-14b7-4333-a8b8-e617fd72b6a1","Type":"ContainerStarted","Data":"cdca9bd1144de7bb676a9e927a4731ca135240f702e032b3a97810dd1dddb95c"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.706521 4834 generic.go:334] "Generic (PLEG): container finished" podID="936e07e4-586f-4092-ad38-4e49512485a5" containerID="deaf89d50c7500084331ec67e4ec2466ea1347bbdb5a009ebed7ab5dffd8fc20" exitCode=0 Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.707310 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" event={"ID":"936e07e4-586f-4092-ad38-4e49512485a5","Type":"ContainerDied","Data":"deaf89d50c7500084331ec67e4ec2466ea1347bbdb5a009ebed7ab5dffd8fc20"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.711273 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" event={"ID":"ec174061-c43b-4749-8684-500ce8aaea32","Type":"ContainerStarted","Data":"59c8c621ff974dc2bbb307c3ff7d47370ec63710753b16e531d75ff93f753303"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.719064 4834 generic.go:334] "Generic (PLEG): container finished" podID="80186c7b-b4a2-4480-a605-18eafe6067fb" containerID="99e7327255365d0073a67d910d2d795d151e6bdea37d83e96465e9c5cdb132ec" exitCode=0 Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.719176 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" event={"ID":"80186c7b-b4a2-4480-a605-18eafe6067fb","Type":"ContainerDied","Data":"99e7327255365d0073a67d910d2d795d151e6bdea37d83e96465e9c5cdb132ec"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.729336 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" event={"ID":"5496ae0a-5098-49eb-9a39-82e4d0c584bf","Type":"ContainerStarted","Data":"b37624f0757bb446b6a7c569534fd9c58da95bfe6b70e3b620dcfb64bb9a69ce"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.730136 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.731667 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" podStartSLOduration=129.731644734 podStartE2EDuration="2m9.731644734s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.72974757 +0000 UTC m=+156.552632326" watchObservedRunningTime="2025-10-08 22:25:48.731644734 +0000 UTC m=+156.554529480" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.731842 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" podStartSLOduration=129.731835109 podStartE2EDuration="2m9.731835109s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.698320299 +0000 UTC m=+156.521205045" watchObservedRunningTime="2025-10-08 22:25:48.731835109 +0000 UTC m=+156.554719855" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.731898 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" event={"ID":"f6a0fdcf-b763-4509-8604-17ed927f48a2","Type":"ContainerStarted","Data":"1f0451d97919285e7f1f13a6ef2d6f7f34c96066a6a83aced26ee04c1a68d1e4"} Oct 08 22:25:48 crc 
kubenswrapper[4834]: I1008 22:25:48.732076 4834 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4v5sm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.732122 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.734628 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.734780 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" event={"ID":"6a1cdcf6-51bb-46e5-a3b7-80210921f1a6","Type":"ContainerStarted","Data":"011bf0ca65df23ac212dd5b7df1422d888eccc15c9e85eb2f5d03dec7d8203ea"} Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.735041 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.235025028 +0000 UTC m=+157.057909774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.737639 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.738244 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" event={"ID":"c012967e-0c3a-4ce9-9c65-b0b84e3c0308","Type":"ContainerStarted","Data":"2ac3a23de4638c56962cd11614559162785a5fbcd6640cfd23b80249a8671f90"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.742306 4834 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-66msg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.742365 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" podUID="6a1cdcf6-51bb-46e5-a3b7-80210921f1a6" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.747857 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" 
event={"ID":"3c302d50-f83a-448d-a914-905ec04ada98","Type":"ContainerStarted","Data":"d60f771dc7d8aa571922d2bdcb0cccf4509c88a3299a79da67ff81630f1770ca"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.757023 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" event={"ID":"95df5fe3-a6c2-4af6-94af-5e0540baf2a6","Type":"ContainerStarted","Data":"4f03ebbdc5f837488683598e62638c30eddd26414fd5b65f3bdd27a13e63363e"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.759115 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-j595w" podStartSLOduration=129.759084572 podStartE2EDuration="2m9.759084572s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.758495536 +0000 UTC m=+156.581380282" watchObservedRunningTime="2025-10-08 22:25:48.759084572 +0000 UTC m=+156.581969318" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.780793 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sfxxw" event={"ID":"faa59b6b-6fdb-47fa-9b3e-92502adc3c70","Type":"ContainerStarted","Data":"0b6ae434695cda0775f83b8d3eb1b54f0c5b2f035bb7d1bbf098e6f3a20c836a"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.787213 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" event={"ID":"b1feb4b4-f628-4b37-9cb9-0e01269f5825","Type":"ContainerStarted","Data":"0d14273c3ae174849aa0dc25854117e43cef536ea412cb8e094ea9c7808e1a21"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.789244 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lbmrk" 
event={"ID":"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19","Type":"ContainerStarted","Data":"cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.792415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" event={"ID":"c9182dd7-0f3e-4916-b53e-0a9de9fffa89","Type":"ContainerStarted","Data":"6899aa93d3a413f97657ed4c04a801b83280efa3433694ade6ceb9353f8612b4"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.795204 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" event={"ID":"18524e10-53d8-4201-9c5c-2564d88cfbfa","Type":"ContainerStarted","Data":"5deede1a84e5041b505deecb49e769c64942fd534efe6e22a32207680a9c8f03"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.796972 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" event={"ID":"b42878da-c62e-425b-b147-57836dcd9a2d","Type":"ContainerStarted","Data":"0575a0b18c15261c833590482eb4f46c3f03d09459373bdd6e6548c49ef25d0b"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.798542 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" event={"ID":"b45473af-e174-4767-91d1-317e402f20a2","Type":"ContainerStarted","Data":"46a9b674042066cd929398db1bbb67ceaa3d5004a57cbdcea13fc928c947f214"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.801028 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" event={"ID":"6e9dedc4-5147-4980-8501-834fe5976c00","Type":"ContainerStarted","Data":"34789c16d962e7b5934ca04fb2ec45ddc8589159ac175784fdcf1bf101a340ee"} Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.801101 4834 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.802851 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-x69p6" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.803315 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.803440 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.806701 4834 patch_prober.go:28] interesting pod/console-operator-58897d9998-x69p6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.806836 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-x69p6" podUID="d832b5a7-1a3a-4da8-8524-6bdf31d4fd94" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.815337 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" podStartSLOduration=129.815315049 
podStartE2EDuration="2m9.815315049s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.815135104 +0000 UTC m=+156.638019850" watchObservedRunningTime="2025-10-08 22:25:48.815315049 +0000 UTC m=+156.638199785" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.836621 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.839260 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.339224379 +0000 UTC m=+157.162109135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.858936 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.936071 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.936129 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.939415 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-rdz6s" podStartSLOduration=129.939396037 podStartE2EDuration="2m9.939396037s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.849193909 +0000 UTC m=+156.672078675" watchObservedRunningTime="2025-10-08 22:25:48.939396037 +0000 UTC m=+156.762280783" Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.939786 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:48 crc kubenswrapper[4834]: E1008 22:25:48.943957 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.443934424 +0000 UTC m=+157.266819170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:48 crc kubenswrapper[4834]: I1008 22:25:48.983600 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-nn8k8" podStartSLOduration=129.983569106 podStartE2EDuration="2m9.983569106s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:48.979501262 +0000 UTC m=+156.802386008" watchObservedRunningTime="2025-10-08 22:25:48.983569106 +0000 UTC m=+156.806453852" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.032328 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-txrxb" podStartSLOduration=130.032298801 podStartE2EDuration="2m10.032298801s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.021793717 +0000 UTC m=+156.844678463" watchObservedRunningTime="2025-10-08 22:25:49.032298801 +0000 UTC m=+156.855183547" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.045927 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.060414 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.560370108 +0000 UTC m=+157.383254854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.130415 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lbmrk" podStartSLOduration=130.130384621 podStartE2EDuration="2m10.130384621s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.12924484 +0000 UTC m=+156.952129586" watchObservedRunningTime="2025-10-08 22:25:49.130384621 +0000 UTC m=+156.953269367" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.130748 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-d7fsf" podStartSLOduration=130.130742782 podStartE2EDuration="2m10.130742782s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.085126433 +0000 UTC m=+156.908011179" watchObservedRunningTime="2025-10-08 22:25:49.130742782 +0000 UTC m=+156.953627528" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.162964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: 
\"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.163474 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.663457549 +0000 UTC m=+157.486342295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.191566 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wk5x7" podStartSLOduration=130.191543896 podStartE2EDuration="2m10.191543896s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.165770554 +0000 UTC m=+156.988655300" watchObservedRunningTime="2025-10-08 22:25:49.191543896 +0000 UTC m=+157.014428632" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.191849 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-x69p6" podStartSLOduration=130.191844254 podStartE2EDuration="2m10.191844254s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 22:25:49.191160575 +0000 UTC m=+157.014045321" watchObservedRunningTime="2025-10-08 22:25:49.191844254 +0000 UTC m=+157.014729000" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.211825 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kdqjq" podStartSLOduration=130.211791954 podStartE2EDuration="2m10.211791954s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.21132254 +0000 UTC m=+157.034207296" watchObservedRunningTime="2025-10-08 22:25:49.211791954 +0000 UTC m=+157.034676700" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.235236 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-4gjq8" podStartSLOduration=130.23521016 podStartE2EDuration="2m10.23521016s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.232283468 +0000 UTC m=+157.055168224" watchObservedRunningTime="2025-10-08 22:25:49.23521016 +0000 UTC m=+157.058094916" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.265181 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.265502 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.765485269 +0000 UTC m=+157.588370015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.322217 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sfxxw" podStartSLOduration=7.322197498 podStartE2EDuration="7.322197498s" podCreationTimestamp="2025-10-08 22:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.290614513 +0000 UTC m=+157.113499269" watchObservedRunningTime="2025-10-08 22:25:49.322197498 +0000 UTC m=+157.145082244" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.322953 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-66fwq" podStartSLOduration=130.32294843 podStartE2EDuration="2m10.32294843s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.32153036 +0000 UTC m=+157.144415106" watchObservedRunningTime="2025-10-08 22:25:49.32294843 +0000 UTC m=+157.145833176" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.370020 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.370476 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.870459262 +0000 UTC m=+157.693344008 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.413362 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j8g6p" podStartSLOduration=130.413342964 podStartE2EDuration="2m10.413342964s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.412431798 +0000 UTC m=+157.235316554" watchObservedRunningTime="2025-10-08 22:25:49.413342964 +0000 UTC m=+157.236227710" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.421726 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-86m4g" 
podStartSLOduration=130.421704328 podStartE2EDuration="2m10.421704328s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.379211347 +0000 UTC m=+157.202096093" watchObservedRunningTime="2025-10-08 22:25:49.421704328 +0000 UTC m=+157.244589074" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.471682 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.472158 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:49.972123651 +0000 UTC m=+157.795008397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.478268 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" podStartSLOduration=130.478252183 podStartE2EDuration="2m10.478252183s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.478195962 +0000 UTC m=+157.301080728" watchObservedRunningTime="2025-10-08 22:25:49.478252183 +0000 UTC m=+157.301136929" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.581182 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.581520 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.081507888 +0000 UTC m=+157.904392634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.682767 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.683049 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.183020133 +0000 UTC m=+158.005904879 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.683304 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.683703 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.183696343 +0000 UTC m=+158.006581089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.784745 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.785382 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.285351872 +0000 UTC m=+158.108236618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.787179 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.787823 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.287804581 +0000 UTC m=+158.110689317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.811637 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" event={"ID":"c012967e-0c3a-4ce9-9c65-b0b84e3c0308","Type":"ContainerStarted","Data":"d2c80619211840d07995289c485be6433e091b255e8fd28043cee57c4ae24312"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.814873 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" event={"ID":"936e07e4-586f-4092-ad38-4e49512485a5","Type":"ContainerStarted","Data":"f6f4ff87f32d6a13c58ce2fd5b815b18670f8fb7f6d3411efb442466a15a7336"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.814975 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.817869 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" event={"ID":"77239a2f-ad60-4314-8d76-87449351907a","Type":"ContainerStarted","Data":"e16f8b89bb5731a16245221af1bd70173cfda536b8e07276bc550235668c81e7"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.819858 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r2fr4" event={"ID":"34467af5-f30c-4d07-a6cb-9d300338f8a1","Type":"ContainerStarted","Data":"17f94182889d3621975ecf646b5ea1c8b3cfa1b99d982016e88813bc9dafe8d7"} Oct 08 22:25:49 
crc kubenswrapper[4834]: I1008 22:25:49.822317 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" event={"ID":"6e9dedc4-5147-4980-8501-834fe5976c00","Type":"ContainerStarted","Data":"009513a29ffe383a6d0ec3866ac8794cee01cc547570d5a0d43eaaaada45ba9c"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.827443 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" event={"ID":"c9182dd7-0f3e-4916-b53e-0a9de9fffa89","Type":"ContainerStarted","Data":"aca441c088c016b3eb4d5fb8f4cd6e5594c52559e4a9f56c1772f030d89eb777"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.830301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" event={"ID":"7e8cf764-332f-4e89-ad19-6dad90f92692","Type":"ContainerStarted","Data":"8664e5f1f561a30bfc5b6059f9e537289149975e796d070c625cc9c54fe6bcb5"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.830834 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.834275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" event={"ID":"1d09cf73-0239-4398-b142-17fc3892aca9","Type":"ContainerStarted","Data":"25886cd393d9425a17715f46e848aab4732b50af8e02e602468ed55de76a7f16"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.839515 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9lpgw" podStartSLOduration=130.83949992 podStartE2EDuration="2m10.83949992s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.836725793 +0000 UTC m=+157.659610539" watchObservedRunningTime="2025-10-08 22:25:49.83949992 +0000 UTC m=+157.662384656" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.839634 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" podStartSLOduration=130.839631104 podStartE2EDuration="2m10.839631104s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.505818777 +0000 UTC m=+157.328703523" watchObservedRunningTime="2025-10-08 22:25:49.839631104 +0000 UTC m=+157.662515850" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.841270 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" event={"ID":"1b754933-c10c-4ae3-9590-abca704681c6","Type":"ContainerStarted","Data":"83344c5b6769fbb1d020c3ebb564d829d7febc9983d4b48a9f8531c76de913f0"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.841331 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" event={"ID":"1b754933-c10c-4ae3-9590-abca704681c6","Type":"ContainerStarted","Data":"b0ee0df26c2852209b837eabf33623c88defee358f80ead7add300dc46134abc"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.845879 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" event={"ID":"fe82bfed-8295-42b8-bd74-1883fd35bab6","Type":"ContainerStarted","Data":"428ca5a2218198f50af28189b659c47ab1a4655181f5b7e6c171e0d883e6dd12"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.849864 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" event={"ID":"80186c7b-b4a2-4480-a605-18eafe6067fb","Type":"ContainerStarted","Data":"224f3699f11b72dfb76cb9eb19feca860207fe8f2254d3358135592abc617c06"} Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.850496 4834 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4v5sm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.850555 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.852718 4834 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m4dfc container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.852761 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" podUID="bc74cae8-79d9-47fd-aaf4-0bf056569277" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.26:5443/healthz\": dial tcp 10.217.0.26:5443: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854123 4834 patch_prober.go:28] interesting pod/console-operator-58897d9998-x69p6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 
10.217.0.19:8443: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854195 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-x69p6" podUID="d832b5a7-1a3a-4da8-8524-6bdf31d4fd94" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854391 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854422 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854445 4834 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zhdx4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854476 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" podUID="0e8c92dc-69ee-4e97-a5a3-d1c1bd7d4a7d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854489 4834 patch_prober.go:28] 
interesting pod/catalog-operator-68c6474976-66msg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.854512 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" podUID="6a1cdcf6-51bb-46e5-a3b7-80210921f1a6" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.861943 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.861998 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.862907 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-q7hzk" podStartSLOduration=130.862887286 podStartE2EDuration="2m10.862887286s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.861834176 +0000 UTC m=+157.684718932" watchObservedRunningTime="2025-10-08 22:25:49.862887286 +0000 UTC m=+157.685772022" Oct 08 22:25:49 crc 
kubenswrapper[4834]: I1008 22:25:49.888235 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.890382 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.390365306 +0000 UTC m=+158.213250052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.910590 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f6zdn" podStartSLOduration=130.910568853 podStartE2EDuration="2m10.910568853s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.907787015 +0000 UTC m=+157.730671761" watchObservedRunningTime="2025-10-08 22:25:49.910568853 +0000 UTC m=+157.733453599" Oct 08 22:25:49 crc kubenswrapper[4834]: I1008 22:25:49.993654 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:49 crc kubenswrapper[4834]: E1008 22:25:49.994950 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.494933288 +0000 UTC m=+158.317818034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.050136 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xh4x8" podStartSLOduration=131.050116234 podStartE2EDuration="2m11.050116234s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:49.956634814 +0000 UTC m=+157.779519550" watchObservedRunningTime="2025-10-08 22:25:50.050116234 +0000 UTC m=+157.873000970" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.096123 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.097013 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.596993129 +0000 UTC m=+158.419877875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.129865 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx" podStartSLOduration=131.129843239 podStartE2EDuration="2m11.129843239s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.054360633 +0000 UTC m=+157.877245379" watchObservedRunningTime="2025-10-08 22:25:50.129843239 +0000 UTC m=+157.952727985" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.130892 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" podStartSLOduration=131.130886759 podStartE2EDuration="2m11.130886759s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.127286257 +0000 UTC m=+157.950171003" watchObservedRunningTime="2025-10-08 22:25:50.130886759 +0000 UTC m=+157.953771505" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.168868 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.168915 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.171754 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" podStartSLOduration=131.171738884 podStartE2EDuration="2m11.171738884s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.160045736 +0000 UTC m=+157.982930482" watchObservedRunningTime="2025-10-08 22:25:50.171738884 +0000 UTC m=+157.994623630" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.175768 4834 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-9v995 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.175822 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995" podUID="80186c7b-b4a2-4480-a605-18eafe6067fb" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 08 22:25:50 crc 
kubenswrapper[4834]: I1008 22:25:50.197645 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.198102 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.698088623 +0000 UTC m=+158.520973369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.219777 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-sr6sl" podStartSLOduration=131.219757549 podStartE2EDuration="2m11.219757549s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.219017819 +0000 UTC m=+158.041902565" watchObservedRunningTime="2025-10-08 22:25:50.219757549 +0000 UTC m=+158.042642295" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.298646 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.299027 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.799005801 +0000 UTC m=+158.621890547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.355260 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7ktg9" podStartSLOduration=131.355238048 podStartE2EDuration="2m11.355238048s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.293747204 +0000 UTC m=+158.116631940" watchObservedRunningTime="2025-10-08 22:25:50.355238048 +0000 UTC m=+158.178122794" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.355739 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-m5dj6" podStartSLOduration=131.355735352 podStartE2EDuration="2m11.355735352s" 
podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.35140647 +0000 UTC m=+158.174291216" watchObservedRunningTime="2025-10-08 22:25:50.355735352 +0000 UTC m=+158.178620098" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.379079 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-j64fd" podStartSLOduration=8.379054885 podStartE2EDuration="8.379054885s" podCreationTimestamp="2025-10-08 22:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.375491666 +0000 UTC m=+158.198376412" watchObservedRunningTime="2025-10-08 22:25:50.379054885 +0000 UTC m=+158.201939621" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.400298 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.400707 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:50.900688852 +0000 UTC m=+158.723573598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.403512 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" podStartSLOduration=131.403500721 podStartE2EDuration="2m11.403500721s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.403020347 +0000 UTC m=+158.225905093" watchObservedRunningTime="2025-10-08 22:25:50.403500721 +0000 UTC m=+158.226385467" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.501748 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.502156 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.002104525 +0000 UTC m=+158.824989271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.603692 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.604137 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.104118115 +0000 UTC m=+158.927002861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.705559 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.705732 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.205704323 +0000 UTC m=+159.028589069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.705976 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.706437 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.206419343 +0000 UTC m=+159.029304089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.807633 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.807959 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.307936749 +0000 UTC m=+159.130821495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.855377 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" event={"ID":"77239a2f-ad60-4314-8d76-87449351907a","Type":"ContainerStarted","Data":"94e47d5c1ab3416c3c01f977c692a1b12072f3c49c4a415992cec53ecbbca67e"} Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.858906 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" event={"ID":"933ce2c9-9ecb-4c1d-8ff7-d50919a3ce4e","Type":"ContainerStarted","Data":"8ba2140c15199bf026be08a007e8979f6048b069270f391b3fb37462e9d1b985"} Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.859483 4834 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-66msg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.859532 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" podUID="6a1cdcf6-51bb-46e5-a3b7-80210921f1a6" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.860042 4834 patch_prober.go:28] interesting 
pod/oauth-openshift-558db77b4-4v5sm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.860095 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.867804 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:50 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:50 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:50 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.867870 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.880792 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-r2fr4" podStartSLOduration=8.880770610999999 podStartE2EDuration="8.880770611s" podCreationTimestamp="2025-10-08 22:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.880592515 +0000 UTC m=+158.703477261" watchObservedRunningTime="2025-10-08 
22:25:50.880770611 +0000 UTC m=+158.703655357" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.906019 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4xwnj" podStartSLOduration=131.906000008 podStartE2EDuration="2m11.906000008s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:50.90359452 +0000 UTC m=+158.726479266" watchObservedRunningTime="2025-10-08 22:25:50.906000008 +0000 UTC m=+158.728884754" Oct 08 22:25:50 crc kubenswrapper[4834]: I1008 22:25:50.908584 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:50 crc kubenswrapper[4834]: E1008 22:25:50.908985 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.408966791 +0000 UTC m=+159.231851537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.009600 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.009757 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.509727946 +0000 UTC m=+159.332612692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.010039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.011498 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.511481165 +0000 UTC m=+159.334365911 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.111499 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.112052 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.612015073 +0000 UTC m=+159.434899819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.213332 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.213731 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.713715423 +0000 UTC m=+159.536600169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.314844 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.315034 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.815001253 +0000 UTC m=+159.637885999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.315294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.315661 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.815647571 +0000 UTC m=+159.638532317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.416407 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.416679 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.916649432 +0000 UTC m=+159.739534178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.416747 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.417193 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:51.917173688 +0000 UTC m=+159.740058434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.517212 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.517435 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.017403427 +0000 UTC m=+159.840288173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.617938 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.618404 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.118384877 +0000 UTC m=+159.941269623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.718957 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.719107 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.219087391 +0000 UTC m=+160.041972137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.719223 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.719505 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.219497983 +0000 UTC m=+160.042382729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.820062 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.820214 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.320187886 +0000 UTC m=+160.143072642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.820581 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.820937 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.320925966 +0000 UTC m=+160.143810732 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.867081 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:51 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:51 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:51 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.867170 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.868394 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" event={"ID":"7c2cc088-ced0-4542-878b-48488976518a","Type":"ContainerStarted","Data":"c422414bd2c98bd455351f3822dc63f4843a709ea95a6cbe1e73b07f51b8eeb4"} Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.905235 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" podStartSLOduration=132.905210368 podStartE2EDuration="2m12.905210368s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:25:51.903376677 +0000 UTC m=+159.726261433" watchObservedRunningTime="2025-10-08 22:25:51.905210368 +0000 UTC m=+159.728095124" Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.921477 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.921694 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.4216587 +0000 UTC m=+160.244543456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.921909 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:51 crc kubenswrapper[4834]: E1008 22:25:51.922703 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.422695059 +0000 UTC m=+160.245579805 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:51 crc kubenswrapper[4834]: I1008 22:25:51.936420 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-r2fr4" Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.023205 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.023456 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.523424163 +0000 UTC m=+160.346308909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.023580 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.024260 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.524229065 +0000 UTC m=+160.347113811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.125198 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.125478 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.625441583 +0000 UTC m=+160.448326329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.125747 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.126188 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.626180063 +0000 UTC m=+160.449064799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.226510 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.226825 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.726805484 +0000 UTC m=+160.549690230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.328288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.328779 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.828754742 +0000 UTC m=+160.651639488 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.429916 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.430201 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:52.930176785 +0000 UTC m=+160.753061531 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.531468 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.531833 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.031820355 +0000 UTC m=+160.854705101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.633247 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.633686 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.133660019 +0000 UTC m=+160.956544765 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.735615 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.736049 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.236023859 +0000 UTC m=+161.058908605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.836902 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.837124 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.337087002 +0000 UTC m=+161.159971748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.837676 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.838096 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.33808839 +0000 UTC m=+161.160973126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.853131 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.864655 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:52 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:52 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:52 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.865768 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.938774 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.938979 4834 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.438938798 +0000 UTC m=+161.261823544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:52 crc kubenswrapper[4834]: I1008 22:25:52.939081 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:52 crc kubenswrapper[4834]: E1008 22:25:52.939638 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.439613776 +0000 UTC m=+161.262498522 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.040457 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.040702 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.540660148 +0000 UTC m=+161.363544894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.041300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.041805 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.54178522 +0000 UTC m=+161.364669966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.083980 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.084665 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.090026 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.090072 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.134246 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.142790 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.143085 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.643050659 +0000 UTC m=+161.465935415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.143650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06172a77-1bd6-447d-9276-c4bfd79efba9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.143811 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.143972 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06172a77-1bd6-447d-9276-c4bfd79efba9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.144210 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.644199041 +0000 UTC m=+161.467083787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.244879 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.245588 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.744974306 +0000 UTC m=+161.567859052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.245756 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06172a77-1bd6-447d-9276-c4bfd79efba9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.245914 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06172a77-1bd6-447d-9276-c4bfd79efba9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.246103 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06172a77-1bd6-447d-9276-c4bfd79efba9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.246175 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: 
\"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.246668 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.746647443 +0000 UTC m=+161.569532189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.285028 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06172a77-1bd6-447d-9276-c4bfd79efba9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.347667 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.347909 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-08 22:25:53.847872721 +0000 UTC m=+161.670757467 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.348071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.348453 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.848436727 +0000 UTC m=+161.671321473 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.405374 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.449246 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.449566 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.94953355 +0000 UTC m=+161.772418296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.449952 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.450512 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:53.950487037 +0000 UTC m=+161.773371783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.556334 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.556984 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.056948632 +0000 UTC m=+161.879833378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.558486 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.559343 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.059324609 +0000 UTC m=+161.882209355 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.659683 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.659885 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.159852726 +0000 UTC m=+161.982737472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.660274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.660681 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.160672869 +0000 UTC m=+161.983557615 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.663781 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hfv2p"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.665073 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.673155 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.679350 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfv2p"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.711385 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.761057 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.761446 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.261412194 +0000 UTC m=+162.084296940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.761611 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.761675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-catalog-content\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.761721 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-utilities\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.761759 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxb65\" (UniqueName: \"kubernetes.io/projected/2db23b8a-4d46-47bb-8ed4-ad6747401463-kube-api-access-kxb65\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.761950 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.261942498 +0000 UTC m=+162.084827244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.847445 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9ct27"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.848883 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.853998 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.868208 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.868417 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-catalog-content\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.868458 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-utilities\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.868487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxb65\" (UniqueName: \"kubernetes.io/projected/2db23b8a-4d46-47bb-8ed4-ad6747401463-kube-api-access-kxb65\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.868532 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:53 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:53 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:53 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.868592 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.869008 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.368986679 +0000 UTC m=+162.191871425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.869404 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-catalog-content\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.869443 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-utilities\") pod \"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.873651 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ct27"] Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.903979 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"06172a77-1bd6-447d-9276-c4bfd79efba9","Type":"ContainerStarted","Data":"dcaf22d04ea5cdcdd8ddc59943f225b376d050e62ee4ade1ae54223370109377"} Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.919292 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxb65\" (UniqueName: \"kubernetes.io/projected/2db23b8a-4d46-47bb-8ed4-ad6747401463-kube-api-access-kxb65\") pod 
\"community-operators-hfv2p\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.970396 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-utilities\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.970467 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pcg8\" (UniqueName: \"kubernetes.io/projected/edebf05e-6b29-4d72-9805-66328cac3d49-kube-api-access-6pcg8\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.970520 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:53 crc kubenswrapper[4834]: I1008 22:25:53.970544 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-catalog-content\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:53 crc kubenswrapper[4834]: E1008 22:25:53.970894 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.470877436 +0000 UTC m=+162.293762182 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.014394 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.051109 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c42p9"] Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.052138 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.071631 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c42p9"] Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.072045 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.072451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-catalog-content\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.072510 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-utilities\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.072567 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pcg8\" (UniqueName: \"kubernetes.io/projected/edebf05e-6b29-4d72-9805-66328cac3d49-kube-api-access-6pcg8\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.073106 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.573084401 +0000 UTC m=+162.395969147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.073613 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-catalog-content\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.073715 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-utilities\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.094489 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pcg8\" (UniqueName: \"kubernetes.io/projected/edebf05e-6b29-4d72-9805-66328cac3d49-kube-api-access-6pcg8\") pod \"certified-operators-9ct27\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.129880 4834 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2vxck" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.174435 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-utilities\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.174517 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-catalog-content\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.174566 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzzd4\" (UniqueName: \"kubernetes.io/projected/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-kube-api-access-pzzd4\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.174597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.175047 4834 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.675027538 +0000 UTC m=+162.497912294 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.177523 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.247926 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-29sz4"] Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.249574 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.272715 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29sz4"]
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.275813 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hfv2p"]
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.276478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.276613 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.776590926 +0000 UTC m=+162.599475672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.276787 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-catalog-content\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.276846 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-utilities\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.276945 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-utilities\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.277025 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-catalog-content\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.277082 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krbzl\" (UniqueName: \"kubernetes.io/projected/ee84722f-b4c1-40dd-bc05-1df112e48a98-kube-api-access-krbzl\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.277112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzzd4\" (UniqueName: \"kubernetes.io/projected/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-kube-api-access-pzzd4\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.277153 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.277505 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.777482061 +0000 UTC m=+162.600366827 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: W1008 22:25:54.278529 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db23b8a_4d46_47bb_8ed4_ad6747401463.slice/crio-2f1bca872d66dacd4e0c589e7f6f0f9a30402ad3a74447768ce7627b5be169f8 WatchSource:0}: Error finding container 2f1bca872d66dacd4e0c589e7f6f0f9a30402ad3a74447768ce7627b5be169f8: Status 404 returned error can't find the container with id 2f1bca872d66dacd4e0c589e7f6f0f9a30402ad3a74447768ce7627b5be169f8
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.313185 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-utilities\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.314002 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-catalog-content\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.340457 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzzd4\" (UniqueName: \"kubernetes.io/projected/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-kube-api-access-pzzd4\") pod \"community-operators-c42p9\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.368483 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c42p9"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.378893 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.379114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-utilities\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.379216 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krbzl\" (UniqueName: \"kubernetes.io/projected/ee84722f-b4c1-40dd-bc05-1df112e48a98-kube-api-access-krbzl\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.379277 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-catalog-content\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.379770 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-catalog-content\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.379869 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.879850821 +0000 UTC m=+162.702735567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.380076 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-utilities\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.470234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krbzl\" (UniqueName: \"kubernetes.io/projected/ee84722f-b4c1-40dd-bc05-1df112e48a98-kube-api-access-krbzl\") pod \"certified-operators-29sz4\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.483057 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.483566 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:54.983546997 +0000 UTC m=+162.806431743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.546999 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9ct27"]
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.582599 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29sz4"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.583588 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.584011 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.083960092 +0000 UTC m=+162.906844848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.584130 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.584541 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.084525618 +0000 UTC m=+162.907410354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.660083 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c42p9"]
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.685386 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.686170 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.186121856 +0000 UTC m=+163.009006602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.687964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.688520 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.188502923 +0000 UTC m=+163.011387669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.789192 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.789419 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.28937697 +0000 UTC m=+163.112261726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.789649 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.790413 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.290381089 +0000 UTC m=+163.113265875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: W1008 22:25:54.813293 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19ec829_d35e_4acb_a5bb_8f2e81a67f4d.slice/crio-5cdf7979c6db74b949f781fb0c3fe9946e6a6b682d96d1886afe50de50dd0cdd WatchSource:0}: Error finding container 5cdf7979c6db74b949f781fb0c3fe9946e6a6b682d96d1886afe50de50dd0cdd: Status 404 returned error can't find the container with id 5cdf7979c6db74b949f781fb0c3fe9946e6a6b682d96d1886afe50de50dd0cdd
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.860800 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 22:25:54 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld
Oct 08 22:25:54 crc kubenswrapper[4834]: [+]process-running ok
Oct 08 22:25:54 crc kubenswrapper[4834]: healthz check failed
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.860873 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.891851 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.891964 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.391934665 +0000 UTC m=+163.214819411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.892126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.892392 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.392385558 +0000 UTC m=+163.215270304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.910197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c42p9" event={"ID":"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d","Type":"ContainerStarted","Data":"5cdf7979c6db74b949f781fb0c3fe9946e6a6b682d96d1886afe50de50dd0cdd"}
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.911770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"06172a77-1bd6-447d-9276-c4bfd79efba9","Type":"ContainerStarted","Data":"e94bbc0a4565e4cfd094196409334111eb2342ed9648538344ac2dbf7a048abf"}
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.912737 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerStarted","Data":"411b58f88129e2a07e63223431d3333ae8abc74abbc78d4b0b5c0b348bb72bc9"}
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.917455 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfv2p" event={"ID":"2db23b8a-4d46-47bb-8ed4-ad6747401463","Type":"ContainerStarted","Data":"2f1bca872d66dacd4e0c589e7f6f0f9a30402ad3a74447768ce7627b5be169f8"}
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.976513 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qqn48"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.976565 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qqn48"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.980460 4834 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qqn48 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.980522 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" podUID="77239a2f-ad60-4314-8d76-87449351907a" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused"
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.993587 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.993745 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.493716519 +0000 UTC m=+163.316601265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.994397 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:54 crc kubenswrapper[4834]: E1008 22:25:54.994717 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.494709557 +0000 UTC m=+163.317594303 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:54 crc kubenswrapper[4834]: I1008 22:25:54.997505 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-x69p6"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.033344 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lbmrk"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.033403 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lbmrk"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.036822 4834 patch_prober.go:28] interesting pod/console-f9d7485db-lbmrk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.036873 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lbmrk" podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.046363 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.046420 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.046363 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.046625 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.095056 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.096580 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.596553732 +0000 UTC m=+163.419438478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.098197 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.098661 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.59864174 +0000 UTC m=+163.421526486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.099636 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29sz4"]
Oct 08 22:25:55 crc kubenswrapper[4834]: W1008 22:25:55.106091 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee84722f_b4c1_40dd_bc05_1df112e48a98.slice/crio-666777364ded21529fb548c4862e4dc9d41ec94189385c3dc575f75da4b0b2a0 WatchSource:0}: Error finding container 666777364ded21529fb548c4862e4dc9d41ec94189385c3dc575f75da4b0b2a0: Status 404 returned error can't find the container with id 666777364ded21529fb548c4862e4dc9d41ec94189385c3dc575f75da4b0b2a0
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.174805 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.181427 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.182215 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-9v995"
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.199881 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.200072 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.700042742 +0000 UTC m=+163.522927498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.200176 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.200561 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.700541177 +0000 UTC m=+163.523425923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.301747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.302204 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.802168875 +0000 UTC m=+163.625053621 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.302471 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.304689 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.804663745 +0000 UTC m=+163.627548501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.404587 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.404886 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.904859455 +0000 UTC m=+163.727744211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.405063 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.405682 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:55.905658286 +0000 UTC m=+163.728543022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.506396 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.506637 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.006603976 +0000 UTC m=+163.829488722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.506796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.507162 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.007131032 +0000 UTC m=+163.830015778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.529612 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhdx4" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.608125 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.608313 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.108274737 +0000 UTC m=+163.931159483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.608693 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.609283 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.109259755 +0000 UTC m=+163.932144501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.655111 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-66msg" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.710351 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.710568 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.210548934 +0000 UTC m=+164.033433680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.710651 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.711125 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.211102909 +0000 UTC m=+164.033987655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.812516 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.312484932 +0000 UTC m=+164.135369678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.812761 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.813075 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.813451 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.313443479 +0000 UTC m=+164.136328215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.824648 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m4dfc" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.850721 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sg2s9"] Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.852664 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.862303 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-j595w" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.862557 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.874942 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:55 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:55 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:55 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.875018 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.882942 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg2s9"] Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.902412 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.908582 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.916004 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.916294 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.416277441 +0000 UTC m=+164.239162187 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.916427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fnd8\" (UniqueName: \"kubernetes.io/projected/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-kube-api-access-9fnd8\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.917726 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:55 
crc kubenswrapper[4834]: I1008 22:25:55.917772 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-utilities\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.917794 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-catalog-content\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:55 crc kubenswrapper[4834]: E1008 22:25:55.918189 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.418181264 +0000 UTC m=+164.241066010 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.944654 4834 generic.go:334] "Generic (PLEG): container finished" podID="edebf05e-6b29-4d72-9805-66328cac3d49" containerID="0aaeed775a847f54b20a5c164d8a5aada090cf80ce272f9edcd74ef77a5379b0" exitCode=0 Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.944780 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerDied","Data":"0aaeed775a847f54b20a5c164d8a5aada090cf80ce272f9edcd74ef77a5379b0"} Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.961071 4834 generic.go:334] "Generic (PLEG): container finished" podID="3c302d50-f83a-448d-a914-905ec04ada98" containerID="d60f771dc7d8aa571922d2bdcb0cccf4509c88a3299a79da67ff81630f1770ca" exitCode=0 Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.961221 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" event={"ID":"3c302d50-f83a-448d-a914-905ec04ada98","Type":"ContainerDied","Data":"d60f771dc7d8aa571922d2bdcb0cccf4509c88a3299a79da67ff81630f1770ca"} Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.962723 4834 generic.go:334] "Generic (PLEG): container finished" podID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerID="a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c" exitCode=0 Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.962793 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-29sz4" event={"ID":"ee84722f-b4c1-40dd-bc05-1df112e48a98","Type":"ContainerDied","Data":"a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c"} Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.962814 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29sz4" event={"ID":"ee84722f-b4c1-40dd-bc05-1df112e48a98","Type":"ContainerStarted","Data":"666777364ded21529fb548c4862e4dc9d41ec94189385c3dc575f75da4b0b2a0"} Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.963769 4834 generic.go:334] "Generic (PLEG): container finished" podID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerID="157676f7b443e0637a234fa3eadb56a071012dc7292d28101b32adecf02c979e" exitCode=0 Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.963817 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfv2p" event={"ID":"2db23b8a-4d46-47bb-8ed4-ad6747401463","Type":"ContainerDied","Data":"157676f7b443e0637a234fa3eadb56a071012dc7292d28101b32adecf02c979e"} Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.965550 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.968475 4834 generic.go:334] "Generic (PLEG): container finished" podID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerID="15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce" exitCode=0 Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.968564 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c42p9" event={"ID":"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d","Type":"ContainerDied","Data":"15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce"} Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.970622 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="06172a77-1bd6-447d-9276-c4bfd79efba9" containerID="e94bbc0a4565e4cfd094196409334111eb2342ed9648538344ac2dbf7a048abf" exitCode=0 Oct 08 22:25:55 crc kubenswrapper[4834]: I1008 22:25:55.971551 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"06172a77-1bd6-447d-9276-c4bfd79efba9","Type":"ContainerDied","Data":"e94bbc0a4565e4cfd094196409334111eb2342ed9648538344ac2dbf7a048abf"} Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.020077 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.020324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fnd8\" (UniqueName: \"kubernetes.io/projected/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-kube-api-access-9fnd8\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.020478 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-utilities\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.020516 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-catalog-content\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " 
pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.021053 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-catalog-content\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.021134 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.52111812 +0000 UTC m=+164.344002866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.021761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-utilities\") pod \"redhat-marketplace-sg2s9\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.046974 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fnd8\" (UniqueName: \"kubernetes.io/projected/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-kube-api-access-9fnd8\") pod \"redhat-marketplace-sg2s9\" (UID: 
\"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.085257 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.122170 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.122510 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.622496632 +0000 UTC m=+164.445381378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.223728 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.223939 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.723908275 +0000 UTC m=+164.546793021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.224071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.224403 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.724389308 +0000 UTC m=+164.547274054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.249367 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9bmkb"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.250707 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.269871 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bmkb"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.325186 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.325425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-utilities\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.325487 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brzp\" (UniqueName: 
\"kubernetes.io/projected/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-kube-api-access-4brzp\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.325533 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-catalog-content\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.325710 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.825674877 +0000 UTC m=+164.648559623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.396330 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg2s9"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.426593 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-catalog-content\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.426917 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-utilities\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.427038 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.427192 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4brzp\" (UniqueName: \"kubernetes.io/projected/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-kube-api-access-4brzp\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.430451 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-utilities\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.430751 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-catalog-content\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.440339 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:56.940301861 +0000 UTC m=+164.763186607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.448984 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brzp\" (UniqueName: \"kubernetes.io/projected/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-kube-api-access-4brzp\") pod \"redhat-marketplace-9bmkb\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.528483 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.529022 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.028991758 +0000 UTC m=+164.851876494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.581455 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.630001 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.632099 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.132081127 +0000 UTC m=+164.954965873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.732158 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.732902 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.232862562 +0000 UTC m=+165.055747318 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.733092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.733793 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.233783338 +0000 UTC m=+165.056668094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.834693 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.834914 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.334880002 +0000 UTC m=+165.157764748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.835125 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.835471 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.335464078 +0000 UTC m=+165.158348814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.849905 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nnhss"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.851005 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.853956 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.859213 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnhss"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.860386 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:56 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:56 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:56 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.860449 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.936185 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.936395 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.936494 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wpj9w\" (UniqueName: \"kubernetes.io/projected/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-kube-api-access-wpj9w\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.936516 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-catalog-content\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.936602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-utilities\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:56 crc kubenswrapper[4834]: E1008 22:25:56.936737 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.436709507 +0000 UTC m=+165.259594253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.937573 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.940108 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.943675 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.948407 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.976664 4834 generic.go:334] "Generic (PLEG): container finished" podID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerID="a33f2f4c3b4778d0b0e07fefbb6b1e12588f34bd8c7cca990576326084e90e9c" exitCode=0 Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.977492 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg2s9" event={"ID":"e106fe24-6e8b-4afc-be9f-19de99e7bb9b","Type":"ContainerDied","Data":"a33f2f4c3b4778d0b0e07fefbb6b1e12588f34bd8c7cca990576326084e90e9c"} Oct 08 22:25:56 crc kubenswrapper[4834]: I1008 22:25:56.977522 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg2s9" 
event={"ID":"e106fe24-6e8b-4afc-be9f-19de99e7bb9b","Type":"ContainerStarted","Data":"d285dc12d3d0264c4fd821e3a39521fd200a4157b8cff4cf22037d45ecc0f07f"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.041652 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a78eee7-8970-436a-94e9-985bbb0bee86-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.041704 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpj9w\" (UniqueName: \"kubernetes.io/projected/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-kube-api-access-wpj9w\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.041721 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-catalog-content\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.041841 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.041865 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-utilities\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.041879 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a78eee7-8970-436a-94e9-985bbb0bee86-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.042294 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.542280816 +0000 UTC m=+165.365165562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.042323 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-catalog-content\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.042688 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-utilities\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.044682 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bmkb"] Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.067211 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpj9w\" (UniqueName: \"kubernetes.io/projected/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-kube-api-access-wpj9w\") pod \"redhat-operators-nnhss\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.143669 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.143795 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.643774012 +0000 UTC m=+165.466658758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.144253 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.144297 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a78eee7-8970-436a-94e9-985bbb0bee86-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.144341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a78eee7-8970-436a-94e9-985bbb0bee86-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.144423 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a78eee7-8970-436a-94e9-985bbb0bee86-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.144694 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.644686007 +0000 UTC m=+165.467570753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.163316 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a78eee7-8970-436a-94e9-985bbb0bee86-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.220460 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.246905 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8nbhp"] Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.247541 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.248328 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.74826549 +0000 UTC m=+165.571150236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.249567 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.256017 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.260973 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8nbhp"] Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.282799 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.330267 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.349671 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06172a77-1bd6-447d-9276-c4bfd79efba9-kubelet-dir\") pod \"06172a77-1bd6-447d-9276-c4bfd79efba9\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.349807 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06172a77-1bd6-447d-9276-c4bfd79efba9-kube-api-access\") pod \"06172a77-1bd6-447d-9276-c4bfd79efba9\" (UID: \"06172a77-1bd6-447d-9276-c4bfd79efba9\") " Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.350133 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78fw\" (UniqueName: \"kubernetes.io/projected/78932c45-91a2-46b0-9b9d-73f4c14d2706-kube-api-access-d78fw\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.350231 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-utilities\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.350280 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-catalog-content\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.350312 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.350409 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06172a77-1bd6-447d-9276-c4bfd79efba9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06172a77-1bd6-447d-9276-c4bfd79efba9" (UID: "06172a77-1bd6-447d-9276-c4bfd79efba9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.350661 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.85064649 +0000 UTC m=+165.673531236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.374555 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06172a77-1bd6-447d-9276-c4bfd79efba9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06172a77-1bd6-447d-9276-c4bfd79efba9" (UID: "06172a77-1bd6-447d-9276-c4bfd79efba9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.452635 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c302d50-f83a-448d-a914-905ec04ada98-secret-volume\") pod \"3c302d50-f83a-448d-a914-905ec04ada98\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.453550 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c302d50-f83a-448d-a914-905ec04ada98-config-volume\") pod \"3c302d50-f83a-448d-a914-905ec04ada98\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.453755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.453815 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pw26\" (UniqueName: \"kubernetes.io/projected/3c302d50-f83a-448d-a914-905ec04ada98-kube-api-access-6pw26\") pod \"3c302d50-f83a-448d-a914-905ec04ada98\" (UID: \"3c302d50-f83a-448d-a914-905ec04ada98\") " Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.454066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d78fw\" (UniqueName: \"kubernetes.io/projected/78932c45-91a2-46b0-9b9d-73f4c14d2706-kube-api-access-d78fw\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.454222 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-utilities\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.454341 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-catalog-content\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.454431 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06172a77-1bd6-447d-9276-c4bfd79efba9-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.454444 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/06172a77-1bd6-447d-9276-c4bfd79efba9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.455404 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c302d50-f83a-448d-a914-905ec04ada98-config-volume" (OuterVolumeSpecName: "config-volume") pod "3c302d50-f83a-448d-a914-905ec04ada98" (UID: "3c302d50-f83a-448d-a914-905ec04ada98"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.455722 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-catalog-content\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.455738 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:57.955704435 +0000 UTC m=+165.778589191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.455963 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-utilities\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.468768 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c302d50-f83a-448d-a914-905ec04ada98-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3c302d50-f83a-448d-a914-905ec04ada98" (UID: "3c302d50-f83a-448d-a914-905ec04ada98"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.483509 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78fw\" (UniqueName: \"kubernetes.io/projected/78932c45-91a2-46b0-9b9d-73f4c14d2706-kube-api-access-d78fw\") pod \"redhat-operators-8nbhp\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.490961 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c302d50-f83a-448d-a914-905ec04ada98-kube-api-access-6pw26" (OuterVolumeSpecName: "kube-api-access-6pw26") pod "3c302d50-f83a-448d-a914-905ec04ada98" (UID: "3c302d50-f83a-448d-a914-905ec04ada98"). InnerVolumeSpecName "kube-api-access-6pw26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.556295 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.556449 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pw26\" (UniqueName: \"kubernetes.io/projected/3c302d50-f83a-448d-a914-905ec04ada98-kube-api-access-6pw26\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.556466 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3c302d50-f83a-448d-a914-905ec04ada98-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.556478 4834 
reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3c302d50-f83a-448d-a914-905ec04ada98-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.556776 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.056757739 +0000 UTC m=+165.879642485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.587686 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nnhss"] Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.591218 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 08 22:25:57 crc kubenswrapper[4834]: W1008 22:25:57.595324 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7a78eee7_8970_436a_94e9_985bbb0bee86.slice/crio-3c7c60ad130c2ac7720a3e3105513eaf4e73dadfe9bc80d0e0a4cb4be57fed1d WatchSource:0}: Error finding container 3c7c60ad130c2ac7720a3e3105513eaf4e73dadfe9bc80d0e0a4cb4be57fed1d: Status 404 returned error can't find the container with id 3c7c60ad130c2ac7720a3e3105513eaf4e73dadfe9bc80d0e0a4cb4be57fed1d Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.595624 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.660923 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.661274 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.161224057 +0000 UTC m=+165.984108873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.662320 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.662722 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.162705518 +0000 UTC m=+165.985590354 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.766922 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.767971 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.267949629 +0000 UTC m=+166.090834375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.791304 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8nbhp"] Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.859832 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:57 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:57 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:57 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.859892 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.870449 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.871722 4834 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.371700858 +0000 UTC m=+166.194585604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.972079 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.972351 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.472303257 +0000 UTC m=+166.295188003 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.972488 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:57 crc kubenswrapper[4834]: E1008 22:25:57.972940 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.472932206 +0000 UTC m=+166.295816952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.983371 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"06172a77-1bd6-447d-9276-c4bfd79efba9","Type":"ContainerDied","Data":"dcaf22d04ea5cdcdd8ddc59943f225b376d050e62ee4ade1ae54223370109377"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.983419 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.983427 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcaf22d04ea5cdcdd8ddc59943f225b376d050e62ee4ade1ae54223370109377" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.984306 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerStarted","Data":"52b561330621f4e787efc0b7047b42b477a53109a9782097b61371c3efa56f84"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.986051 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7a78eee7-8970-436a-94e9-985bbb0bee86","Type":"ContainerStarted","Data":"3c7c60ad130c2ac7720a3e3105513eaf4e73dadfe9bc80d0e0a4cb4be57fed1d"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.987171 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" event={"ID":"3c302d50-f83a-448d-a914-905ec04ada98","Type":"ContainerDied","Data":"ec42d2c109c1c60c0ebe472037628bb6b223cd8f16bb23ed3cbcd04ef097dad7"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.987197 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec42d2c109c1c60c0ebe472037628bb6b223cd8f16bb23ed3cbcd04ef097dad7" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.987269 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb" Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.992796 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerStarted","Data":"efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.992873 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerStarted","Data":"cadd03fe17a6a6292927b0b9604fecd06324905c7289dd59d68891170657aea1"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.994880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" event={"ID":"7c2cc088-ced0-4542-878b-48488976518a","Type":"ContainerStarted","Data":"f160023049442b02bd4d824a1ace61a70178a427067164e2de6e6c9b4acdcb88"} Oct 08 22:25:57 crc kubenswrapper[4834]: I1008 22:25:57.995998 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nbhp" event={"ID":"78932c45-91a2-46b0-9b9d-73f4c14d2706","Type":"ContainerStarted","Data":"04843fc2450d652ccaf4271ba73dcd872adf5c1c3feb344afffa713ded265616"} Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 
22:25:58.073755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.074011 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.573961207 +0000 UTC m=+166.396845953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.074132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.074530 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-08 22:25:58.574515343 +0000 UTC m=+166.397400089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.176028 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.176442 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.676388769 +0000 UTC m=+166.499273555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.177196 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.177754 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.677733556 +0000 UTC m=+166.500618342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.279005 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.279264 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.779233382 +0000 UTC m=+166.602118128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.279446 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.279818 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.779810448 +0000 UTC m=+166.602695194 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.380848 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.381585 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.88153188 +0000 UTC m=+166.704416656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.381725 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.382289 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.88227006 +0000 UTC m=+166.705154836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.482708 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.482920 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.982887451 +0000 UTC m=+166.805772187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.483073 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.483440 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:58.983432596 +0000 UTC m=+166.806317342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.583915 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.584138 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.084106228 +0000 UTC m=+166.906990974 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.584324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.584628 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.084610813 +0000 UTC m=+166.907495559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.685413 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.685669 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.185635095 +0000 UTC m=+167.008519851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.685754 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.686170 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.186159769 +0000 UTC m=+167.009044515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.786760 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.786994 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.286964835 +0000 UTC m=+167.109849581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.787131 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.787525 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.287504691 +0000 UTC m=+167.110389437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.824320 4834 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.860282 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:58 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:58 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:58 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.860378 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.888289 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.888484 4834 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.38843889 +0000 UTC m=+167.211323666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.888752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.889099 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.389083748 +0000 UTC m=+167.211968494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.989521 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.989751 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.489694538 +0000 UTC m=+167.312579324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:58 crc kubenswrapper[4834]: I1008 22:25:58.989822 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:58 crc kubenswrapper[4834]: E1008 22:25:58.990275 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.490258464 +0000 UTC m=+167.313143210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.019173 4834 generic.go:334] "Generic (PLEG): container finished" podID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerID="efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c" exitCode=0 Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.019236 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerDied","Data":"efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c"} Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.092079 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.092374 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.592344396 +0000 UTC m=+167.415229142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.092630 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.093169 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.593157359 +0000 UTC m=+167.416042125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.195122 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.195727 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.695685953 +0000 UTC m=+167.518570709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.195901 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.196591 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.696582738 +0000 UTC m=+167.519467484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.297583 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.297903 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.797860527 +0000 UTC m=+167.620745283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.299936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.300350 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.800333257 +0000 UTC m=+167.623218003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.401766 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.401968 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.901936804 +0000 UTC m=+167.724821550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.402441 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.402848 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:25:59.90283907 +0000 UTC m=+167.725723816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.503291 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.503485 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.0034476 +0000 UTC m=+167.826332356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.503926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.504365 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.004353846 +0000 UTC m=+167.827238612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.605251 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.605533 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.105502182 +0000 UTC m=+167.928386938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.605738 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.606131 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.106118908 +0000 UTC m=+167.929003664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.706613 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.706847 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.206820311 +0000 UTC m=+168.029705057 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.706941 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.707318 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.207310476 +0000 UTC m=+168.030195222 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.750716 4834 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-08T22:25:58.824366714Z","Handler":null,"Name":""} Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.808894 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.809126 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.309090428 +0000 UTC m=+168.131975174 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.809284 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:25:59 crc kubenswrapper[4834]: E1008 22:25:59.809712 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-08 22:26:00.309702646 +0000 UTC m=+168.132587582 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9n7fb" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.859960 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:25:59 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:25:59 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:25:59 crc kubenswrapper[4834]: healthz check failed Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.860032 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.872495 4834 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.872530 4834 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.911395 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.933913 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.984092 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:25:59 crc kubenswrapper[4834]: I1008 22:25:59.989814 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qqn48" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.014619 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.029365 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" event={"ID":"7c2cc088-ced0-4542-878b-48488976518a","Type":"ContainerStarted","Data":"91278bf809fc3aea25d6b33fa837f9729f1de25ed8e5586eb532f3797908348a"} Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.031631 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerID="76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595" exitCode=0 Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.031700 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nbhp" event={"ID":"78932c45-91a2-46b0-9b9d-73f4c14d2706","Type":"ContainerDied","Data":"76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595"} Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.033113 4834 generic.go:334] "Generic (PLEG): container finished" podID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerID="8aa254da0edb162d281a049b8cc5c88a7ab2d2e46ee3ca666bb74cc3cfebeeb9" exitCode=0 Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.033192 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerDied","Data":"8aa254da0edb162d281a049b8cc5c88a7ab2d2e46ee3ca666bb74cc3cfebeeb9"} Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.034936 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7a78eee7-8970-436a-94e9-985bbb0bee86","Type":"ContainerStarted","Data":"51923be1ea486d242b9248d08291fa9dd867420a7f8e3d60a3e0262e6addc373"} Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.091140 4834 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.091336 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.139490 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.13946788 podStartE2EDuration="4.13946788s" podCreationTimestamp="2025-10-08 22:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:26:00.134520572 +0000 UTC m=+167.957405328" watchObservedRunningTime="2025-10-08 22:26:00.13946788 +0000 UTC m=+167.962352636" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.223341 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9n7fb\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.480962 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.712294 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n7fb"] Oct 08 22:26:00 crc kubenswrapper[4834]: W1008 22:26:00.721929 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda975074f_5780_405c_bf73_36ebcaf7bb06.slice/crio-39750ffc8ecd8b545a37202e8fc325e20b685a90b4bc211e7e35fe66f213db32 WatchSource:0}: Error finding container 39750ffc8ecd8b545a37202e8fc325e20b685a90b4bc211e7e35fe66f213db32: Status 404 returned error can't find the container with id 39750ffc8ecd8b545a37202e8fc325e20b685a90b4bc211e7e35fe66f213db32 Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.862389 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:26:00 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:26:00 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:26:00 crc kubenswrapper[4834]: healthz check failed Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.862459 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:26:00 crc kubenswrapper[4834]: I1008 22:26:00.957817 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-r2fr4" Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.043098 4834 generic.go:334] "Generic (PLEG): container finished" podID="7a78eee7-8970-436a-94e9-985bbb0bee86" 
containerID="51923be1ea486d242b9248d08291fa9dd867420a7f8e3d60a3e0262e6addc373" exitCode=0 Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.043321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7a78eee7-8970-436a-94e9-985bbb0bee86","Type":"ContainerDied","Data":"51923be1ea486d242b9248d08291fa9dd867420a7f8e3d60a3e0262e6addc373"} Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.044746 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" event={"ID":"a975074f-5780-405c-bf73-36ebcaf7bb06","Type":"ContainerStarted","Data":"39750ffc8ecd8b545a37202e8fc325e20b685a90b4bc211e7e35fe66f213db32"} Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.052745 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" event={"ID":"7c2cc088-ced0-4542-878b-48488976518a","Type":"ContainerStarted","Data":"a33dd7ca667196f6bd9ac9c0e3d669f14d8f431354e2f6813b68376dfa6ee095"} Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.568764 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.860060 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:26:01 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:26:01 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:26:01 crc kubenswrapper[4834]: healthz check failed Oct 08 22:26:01 crc kubenswrapper[4834]: I1008 22:26:01.860163 4834 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.064475 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" event={"ID":"a975074f-5780-405c-bf73-36ebcaf7bb06","Type":"ContainerStarted","Data":"82239ce17c538ed3be5f479870256ca57d482fe205e55d41489a809824bf45b8"} Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.131000 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gxfrf" podStartSLOduration=20.130978069 podStartE2EDuration="20.130978069s" podCreationTimestamp="2025-10-08 22:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:26:02.127991314 +0000 UTC m=+169.950876060" watchObservedRunningTime="2025-10-08 22:26:02.130978069 +0000 UTC m=+169.953862815" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.148249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.157972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e266421d-b52e-42f9-a7db-88f09ba1c075-metrics-certs\") pod \"network-metrics-daemon-g7fd8\" (UID: \"e266421d-b52e-42f9-a7db-88f09ba1c075\") " pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.188514 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7fd8" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.387713 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.455648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a78eee7-8970-436a-94e9-985bbb0bee86-kube-api-access\") pod \"7a78eee7-8970-436a-94e9-985bbb0bee86\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.455779 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a78eee7-8970-436a-94e9-985bbb0bee86-kubelet-dir\") pod \"7a78eee7-8970-436a-94e9-985bbb0bee86\" (UID: \"7a78eee7-8970-436a-94e9-985bbb0bee86\") " Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.456216 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a78eee7-8970-436a-94e9-985bbb0bee86-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7a78eee7-8970-436a-94e9-985bbb0bee86" (UID: "7a78eee7-8970-436a-94e9-985bbb0bee86"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.465422 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a78eee7-8970-436a-94e9-985bbb0bee86-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7a78eee7-8970-436a-94e9-985bbb0bee86" (UID: "7a78eee7-8970-436a-94e9-985bbb0bee86"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.489698 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7fd8"] Oct 08 22:26:02 crc kubenswrapper[4834]: W1008 22:26:02.503869 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode266421d_b52e_42f9_a7db_88f09ba1c075.slice/crio-698eadc803cfb935d406454fe299c6cdae408b0f60421fd25976f3a15f7f9f1d WatchSource:0}: Error finding container 698eadc803cfb935d406454fe299c6cdae408b0f60421fd25976f3a15f7f9f1d: Status 404 returned error can't find the container with id 698eadc803cfb935d406454fe299c6cdae408b0f60421fd25976f3a15f7f9f1d Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.557327 4834 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a78eee7-8970-436a-94e9-985bbb0bee86-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.557824 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a78eee7-8970-436a-94e9-985bbb0bee86-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.860344 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:26:02 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:26:02 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:26:02 crc kubenswrapper[4834]: healthz check failed Oct 08 22:26:02 crc kubenswrapper[4834]: I1008 22:26:02.860451 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" 
podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.072549 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" event={"ID":"e266421d-b52e-42f9-a7db-88f09ba1c075","Type":"ContainerStarted","Data":"698eadc803cfb935d406454fe299c6cdae408b0f60421fd25976f3a15f7f9f1d"} Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.075534 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7a78eee7-8970-436a-94e9-985bbb0bee86","Type":"ContainerDied","Data":"3c7c60ad130c2ac7720a3e3105513eaf4e73dadfe9bc80d0e0a4cb4be57fed1d"} Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.075570 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c7c60ad130c2ac7720a3e3105513eaf4e73dadfe9bc80d0e0a4cb4be57fed1d" Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.075588 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.075919 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.575381 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" podStartSLOduration=144.575361139 podStartE2EDuration="2m24.575361139s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:26:03.100637881 +0000 UTC m=+170.923522627" watchObservedRunningTime="2025-10-08 22:26:03.575361139 +0000 UTC m=+171.398245885" Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.862180 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 08 22:26:03 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld Oct 08 22:26:03 crc kubenswrapper[4834]: [+]process-running ok Oct 08 22:26:03 crc kubenswrapper[4834]: healthz check failed Oct 08 22:26:03 crc kubenswrapper[4834]: I1008 22:26:03.862251 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 08 22:26:04 crc kubenswrapper[4834]: I1008 22:26:04.086379 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" 
event={"ID":"e266421d-b52e-42f9-a7db-88f09ba1c075","Type":"ContainerStarted","Data":"1c534d16ce7baa0c8636beda9de5971138a33e946b073c2cfac01430e1b8be00"}
Oct 08 22:26:04 crc kubenswrapper[4834]: I1008 22:26:04.861043 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 22:26:04 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld
Oct 08 22:26:04 crc kubenswrapper[4834]: [+]process-running ok
Oct 08 22:26:04 crc kubenswrapper[4834]: healthz check failed
Oct 08 22:26:04 crc kubenswrapper[4834]: I1008 22:26:04.861115 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.035465 4834 patch_prober.go:28] interesting pod/console-f9d7485db-lbmrk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.035940 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lbmrk" podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused"
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.056418 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.056503 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.056521 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.056560 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.101095 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7fd8" event={"ID":"e266421d-b52e-42f9-a7db-88f09ba1c075","Type":"ContainerStarted","Data":"4e62a1c3674972fcab92ea7805e9f80e74a1c66e0082510a9b80e3c6ce22bf2f"}
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.119954 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g7fd8" podStartSLOduration=146.119931548 podStartE2EDuration="2m26.119931548s" podCreationTimestamp="2025-10-08 22:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:26:05.117434058 +0000 UTC m=+172.940318804" watchObservedRunningTime="2025-10-08 22:26:05.119931548 +0000 UTC m=+172.942816294"
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.858297 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 22:26:05 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld
Oct 08 22:26:05 crc kubenswrapper[4834]: [+]process-running ok
Oct 08 22:26:05 crc kubenswrapper[4834]: healthz check failed
Oct 08 22:26:05 crc kubenswrapper[4834]: I1008 22:26:05.858375 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 22:26:06 crc kubenswrapper[4834]: I1008 22:26:06.860016 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 22:26:06 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld
Oct 08 22:26:06 crc kubenswrapper[4834]: [+]process-running ok
Oct 08 22:26:06 crc kubenswrapper[4834]: healthz check failed
Oct 08 22:26:06 crc kubenswrapper[4834]: I1008 22:26:06.860076 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 22:26:07 crc kubenswrapper[4834]: I1008 22:26:07.858178 4834 patch_prober.go:28] interesting pod/router-default-5444994796-j595w container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 08 22:26:07 crc kubenswrapper[4834]: [-]has-synced failed: reason withheld
Oct 08 22:26:07 crc kubenswrapper[4834]: [+]process-running ok
Oct 08 22:26:07 crc kubenswrapper[4834]: healthz check failed
Oct 08 22:26:07 crc kubenswrapper[4834]: I1008 22:26:07.858794 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-j595w" podUID="e72cb57e-d32b-4f1c-9e1d-7fad47d553a9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 08 22:26:08 crc kubenswrapper[4834]: I1008 22:26:08.868337 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-j595w"
Oct 08 22:26:08 crc kubenswrapper[4834]: I1008 22:26:08.875679 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-j595w"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.035389 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lbmrk"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.039904 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lbmrk"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.047115 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.047212 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.047587 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.047660 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.047714 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-86m4g"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.048222 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.048267 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.048305 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"cf0501234783ba0bf8cb4f68dfe1c24c8765335b6f5c030f49a461f3aea2dfca"} pod="openshift-console/downloads-7954f5f757-86m4g" containerMessage="Container download-server failed liveness probe, will be restarted"
Oct 08 22:26:15 crc kubenswrapper[4834]: I1008 22:26:15.048401 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" containerID="cri-o://cf0501234783ba0bf8cb4f68dfe1c24c8765335b6f5c030f49a461f3aea2dfca" gracePeriod=2
Oct 08 22:26:17 crc kubenswrapper[4834]: I1008 22:26:17.025589 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 22:26:17 crc kubenswrapper[4834]: I1008 22:26:17.026438 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 22:26:18 crc kubenswrapper[4834]: I1008 22:26:18.203868 4834 generic.go:334] "Generic (PLEG): container finished" podID="084bae12-5db3-49bc-b703-a694b692c215" containerID="cf0501234783ba0bf8cb4f68dfe1c24c8765335b6f5c030f49a461f3aea2dfca" exitCode=0
Oct 08 22:26:18 crc kubenswrapper[4834]: I1008 22:26:18.203940 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-86m4g" event={"ID":"084bae12-5db3-49bc-b703-a694b692c215","Type":"ContainerDied","Data":"cf0501234783ba0bf8cb4f68dfe1c24c8765335b6f5c030f49a461f3aea2dfca"}
Oct 08 22:26:20 crc kubenswrapper[4834]: I1008 22:26:20.490879 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb"
Oct 08 22:26:21 crc kubenswrapper[4834]: I1008 22:26:21.920445 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 08 22:26:25 crc kubenswrapper[4834]: I1008 22:26:25.047285 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:25 crc kubenswrapper[4834]: I1008 22:26:25.048011 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:25 crc kubenswrapper[4834]: I1008 22:26:25.212587 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-l5mhx"
Oct 08 22:26:35 crc kubenswrapper[4834]: I1008 22:26:35.046480 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:35 crc kubenswrapper[4834]: I1008 22:26:35.047096 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:45 crc kubenswrapper[4834]: I1008 22:26:45.047680 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:45 crc kubenswrapper[4834]: I1008 22:26:45.050453 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:47 crc kubenswrapper[4834]: I1008 22:26:47.025718 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 22:26:47 crc kubenswrapper[4834]: I1008 22:26:47.026332 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 22:26:47 crc kubenswrapper[4834]: I1008 22:26:47.026421 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z"
Oct 08 22:26:47 crc kubenswrapper[4834]: I1008 22:26:47.027566 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 22:26:47 crc kubenswrapper[4834]: I1008 22:26:47.027681 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784" gracePeriod=600
Oct 08 22:26:49 crc kubenswrapper[4834]: E1008 22:26:49.376090 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 08 22:26:49 crc kubenswrapper[4834]: E1008 22:26:49.376728 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kxb65,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-hfv2p_openshift-marketplace(2db23b8a-4d46-47bb-8ed4-ad6747401463): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:49 crc kubenswrapper[4834]: E1008 22:26:49.377920 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hfv2p" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463"
Oct 08 22:26:49 crc kubenswrapper[4834]: I1008 22:26:49.488766 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784" exitCode=0
Oct 08 22:26:49 crc kubenswrapper[4834]: I1008 22:26:49.489000 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784"}
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.845326 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hfv2p" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463"
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.910284 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.910436 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzzd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-c42p9_openshift-marketplace(d19ec829-d35e-4acb-a5bb-8f2e81a67f4d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.912912 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-c42p9" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d"
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.932380 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.932552 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6pcg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9ct27_openshift-marketplace(edebf05e-6b29-4d72-9805-66328cac3d49): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:50 crc kubenswrapper[4834]: E1008 22:26:50.933987 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9ct27" podUID="edebf05e-6b29-4d72-9805-66328cac3d49"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.793470 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.793958 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4brzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9bmkb_openshift-marketplace(ca84cbfa-2a33-4adb-81b4-c6d80f0edd53): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.795211 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9bmkb" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.807356 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.807811 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krbzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-29sz4_openshift-marketplace(ee84722f-b4c1-40dd-bc05-1df112e48a98): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.809310 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-29sz4" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.845318 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.845583 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fnd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sg2s9_openshift-marketplace(e106fe24-6e8b-4afc-be9f-19de99e7bb9b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:51 crc kubenswrapper[4834]: E1008 22:26:51.846922 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sg2s9" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b"
Oct 08 22:26:55 crc kubenswrapper[4834]: I1008 22:26:55.046801 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Oct 08 22:26:55 crc kubenswrapper[4834]: I1008 22:26:55.047465 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.288651 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9bmkb" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.320670 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.320824 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d78fw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8nbhp_openshift-marketplace(78932c45-91a2-46b0-9b9d-73f4c14d2706): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.322181 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8nbhp" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.327914 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.328103 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpj9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nnhss_openshift-marketplace(ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.329443 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nnhss" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca"
Oct 08 22:26:55 crc 
kubenswrapper[4834]: E1008 22:26:55.535598 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8nbhp" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" Oct 08 22:26:55 crc kubenswrapper[4834]: E1008 22:26:55.535978 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nnhss" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" Oct 08 22:26:56 crc kubenswrapper[4834]: I1008 22:26:56.542092 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-86m4g" event={"ID":"084bae12-5db3-49bc-b703-a694b692c215","Type":"ContainerStarted","Data":"40abde782fc736433801318df38aa21b2d8d1ed031e157a946b783328787593d"} Oct 08 22:26:56 crc kubenswrapper[4834]: I1008 22:26:56.542690 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:26:56 crc kubenswrapper[4834]: I1008 22:26:56.542859 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 08 22:26:56 crc kubenswrapper[4834]: I1008 22:26:56.542912 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 08 22:26:56 crc kubenswrapper[4834]: 
I1008 22:26:56.545829 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"719bf800125f02404a6274ba54e261065ace13a170b73466f709f454b2e76f47"} Oct 08 22:26:57 crc kubenswrapper[4834]: I1008 22:26:57.553669 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 08 22:26:57 crc kubenswrapper[4834]: I1008 22:26:57.553748 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 08 22:27:05 crc kubenswrapper[4834]: I1008 22:27:05.046696 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 08 22:27:05 crc kubenswrapper[4834]: I1008 22:27:05.046978 4834 patch_prober.go:28] interesting pod/downloads-7954f5f757-86m4g container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Oct 08 22:27:05 crc kubenswrapper[4834]: I1008 22:27:05.047258 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: 
connect: connection refused" Oct 08 22:27:05 crc kubenswrapper[4834]: I1008 22:27:05.047341 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-86m4g" podUID="084bae12-5db3-49bc-b703-a694b692c215" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.688280 4834 generic.go:334] "Generic (PLEG): container finished" podID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerID="48864183c9bb0f95d315471adde9c7609ec2bbba9e9292387ea66c7ec5c5cc16" exitCode=0 Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.689095 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfv2p" event={"ID":"2db23b8a-4d46-47bb-8ed4-ad6747401463","Type":"ContainerDied","Data":"48864183c9bb0f95d315471adde9c7609ec2bbba9e9292387ea66c7ec5c5cc16"} Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.695348 4834 generic.go:334] "Generic (PLEG): container finished" podID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerID="dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4" exitCode=0 Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.695430 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c42p9" event={"ID":"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d","Type":"ContainerDied","Data":"dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4"} Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.698685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerStarted","Data":"3c37b6ba6d15794ed307c666b54c96c33853476cc067c3444d9a44df0bc4c8a3"} Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.705921 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerID="68168a9814c4c4d210aba1b6d85f1b9dfebb16cf9f68b8f1edcceff1afa48a30" exitCode=0 Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.706621 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg2s9" event={"ID":"e106fe24-6e8b-4afc-be9f-19de99e7bb9b","Type":"ContainerDied","Data":"68168a9814c4c4d210aba1b6d85f1b9dfebb16cf9f68b8f1edcceff1afa48a30"} Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.716849 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerStarted","Data":"f32228ca4ef90d0ada4e141ee9d7e9f8e0d5823a5841a6de15832b63e88855eb"} Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.728049 4834 generic.go:334] "Generic (PLEG): container finished" podID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerID="7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137" exitCode=0 Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.728219 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29sz4" event={"ID":"ee84722f-b4c1-40dd-bc05-1df112e48a98","Type":"ContainerDied","Data":"7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137"} Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.738595 4834 generic.go:334] "Generic (PLEG): container finished" podID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerID="71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27" exitCode=0 Oct 08 22:27:14 crc kubenswrapper[4834]: I1008 22:27:14.738665 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerDied","Data":"71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27"} Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.067485 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-86m4g" Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.757183 4834 generic.go:334] "Generic (PLEG): container finished" podID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerID="3c37b6ba6d15794ed307c666b54c96c33853476cc067c3444d9a44df0bc4c8a3" exitCode=0 Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.757546 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerDied","Data":"3c37b6ba6d15794ed307c666b54c96c33853476cc067c3444d9a44df0bc4c8a3"} Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.769860 4834 generic.go:334] "Generic (PLEG): container finished" podID="edebf05e-6b29-4d72-9805-66328cac3d49" containerID="f32228ca4ef90d0ada4e141ee9d7e9f8e0d5823a5841a6de15832b63e88855eb" exitCode=0 Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.769960 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerDied","Data":"f32228ca4ef90d0ada4e141ee9d7e9f8e0d5823a5841a6de15832b63e88855eb"} Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.786963 4834 generic.go:334] "Generic (PLEG): container finished" podID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerID="f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0" exitCode=0 Oct 08 22:27:15 crc kubenswrapper[4834]: I1008 22:27:15.787012 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nbhp" event={"ID":"78932c45-91a2-46b0-9b9d-73f4c14d2706","Type":"ContainerDied","Data":"f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0"} Oct 08 22:27:37 crc kubenswrapper[4834]: I1008 22:27:37.934632 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-29sz4" event={"ID":"ee84722f-b4c1-40dd-bc05-1df112e48a98","Type":"ContainerStarted","Data":"4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.942673 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerStarted","Data":"81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.944900 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfv2p" event={"ID":"2db23b8a-4d46-47bb-8ed4-ad6747401463","Type":"ContainerStarted","Data":"8d69c48824e0cd79ba851f88fcc5eb0cfff204f6528be1fd70693e8fa727923a"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.947230 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nbhp" event={"ID":"78932c45-91a2-46b0-9b9d-73f4c14d2706","Type":"ContainerStarted","Data":"b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.949339 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c42p9" event={"ID":"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d","Type":"ContainerStarted","Data":"19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.951519 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerStarted","Data":"f20e17609487f88325b1e8856c7211535863fdb023bc199f1a77914eadc02204"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.953735 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg2s9" 
event={"ID":"e106fe24-6e8b-4afc-be9f-19de99e7bb9b","Type":"ContainerStarted","Data":"67a2b1ffac9f3996d48fe063037e6f51aaf3df02ff8e247625b8134f3274cd65"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.955936 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerStarted","Data":"050036223fe323ed90bc5e76e7189f5c0fc9ca230806d62def3b1fe6f30ebc8b"} Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.967113 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-29sz4" podStartSLOduration=19.45073529 podStartE2EDuration="1m44.967089061s" podCreationTimestamp="2025-10-08 22:25:54 +0000 UTC" firstStartedPulling="2025-10-08 22:25:56.97821113 +0000 UTC m=+164.801095866" lastFinishedPulling="2025-10-08 22:27:22.494564881 +0000 UTC m=+250.317449637" observedRunningTime="2025-10-08 22:27:37.96851179 +0000 UTC m=+265.791396576" watchObservedRunningTime="2025-10-08 22:27:38.967089061 +0000 UTC m=+266.789973807" Oct 08 22:27:38 crc kubenswrapper[4834]: I1008 22:27:38.967718 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9bmkb" podStartSLOduration=3.935059639 podStartE2EDuration="1m42.967712677s" podCreationTimestamp="2025-10-08 22:25:56 +0000 UTC" firstStartedPulling="2025-10-08 22:25:59.024029591 +0000 UTC m=+166.846914367" lastFinishedPulling="2025-10-08 22:27:38.056682629 +0000 UTC m=+265.879567405" observedRunningTime="2025-10-08 22:27:38.9640101 +0000 UTC m=+266.786894846" watchObservedRunningTime="2025-10-08 22:27:38.967712677 +0000 UTC m=+266.790597423" Oct 08 22:27:39 crc kubenswrapper[4834]: I1008 22:27:39.013557 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sg2s9" podStartSLOduration=3.895490038 podStartE2EDuration="1m44.013532717s" 
podCreationTimestamp="2025-10-08 22:25:55 +0000 UTC" firstStartedPulling="2025-10-08 22:25:57.997617687 +0000 UTC m=+165.820502423" lastFinishedPulling="2025-10-08 22:27:38.115660316 +0000 UTC m=+265.938545102" observedRunningTime="2025-10-08 22:27:38.990936191 +0000 UTC m=+266.813820947" watchObservedRunningTime="2025-10-08 22:27:39.013532717 +0000 UTC m=+266.836417463" Oct 08 22:27:39 crc kubenswrapper[4834]: I1008 22:27:39.014124 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c42p9" podStartSLOduration=8.043461685 podStartE2EDuration="1m45.014119803s" podCreationTimestamp="2025-10-08 22:25:54 +0000 UTC" firstStartedPulling="2025-10-08 22:25:56.978793747 +0000 UTC m=+164.801678493" lastFinishedPulling="2025-10-08 22:27:33.949451825 +0000 UTC m=+261.772336611" observedRunningTime="2025-10-08 22:27:39.013417414 +0000 UTC m=+266.836302160" watchObservedRunningTime="2025-10-08 22:27:39.014119803 +0000 UTC m=+266.837004549" Oct 08 22:27:39 crc kubenswrapper[4834]: I1008 22:27:39.032642 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hfv2p" podStartSLOduration=3.882104277 podStartE2EDuration="1m46.032616542s" podCreationTimestamp="2025-10-08 22:25:53 +0000 UTC" firstStartedPulling="2025-10-08 22:25:55.965214083 +0000 UTC m=+163.788098849" lastFinishedPulling="2025-10-08 22:27:38.115726328 +0000 UTC m=+265.938611114" observedRunningTime="2025-10-08 22:27:39.031937633 +0000 UTC m=+266.854822379" watchObservedRunningTime="2025-10-08 22:27:39.032616542 +0000 UTC m=+266.855501288" Oct 08 22:27:39 crc kubenswrapper[4834]: I1008 22:27:39.065064 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nnhss" podStartSLOduration=5.018721669 podStartE2EDuration="1m43.065034568s" podCreationTimestamp="2025-10-08 22:25:56 +0000 UTC" firstStartedPulling="2025-10-08 
22:26:00.034523868 +0000 UTC m=+167.857408614" lastFinishedPulling="2025-10-08 22:27:38.080836727 +0000 UTC m=+265.903721513" observedRunningTime="2025-10-08 22:27:39.061744211 +0000 UTC m=+266.884628957" watchObservedRunningTime="2025-10-08 22:27:39.065034568 +0000 UTC m=+266.887919314" Oct 08 22:27:39 crc kubenswrapper[4834]: I1008 22:27:39.084721 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9ct27" podStartSLOduration=12.635572365 podStartE2EDuration="1m46.084697957s" podCreationTimestamp="2025-10-08 22:25:53 +0000 UTC" firstStartedPulling="2025-10-08 22:25:56.979316741 +0000 UTC m=+164.802201477" lastFinishedPulling="2025-10-08 22:27:30.428442303 +0000 UTC m=+258.251327069" observedRunningTime="2025-10-08 22:27:39.081349259 +0000 UTC m=+266.904234005" watchObservedRunningTime="2025-10-08 22:27:39.084697957 +0000 UTC m=+266.907582703" Oct 08 22:27:39 crc kubenswrapper[4834]: I1008 22:27:39.104257 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8nbhp" podStartSLOduration=3.99527927 podStartE2EDuration="1m42.104236213s" podCreationTimestamp="2025-10-08 22:25:57 +0000 UTC" firstStartedPulling="2025-10-08 22:26:00.034095416 +0000 UTC m=+167.856980162" lastFinishedPulling="2025-10-08 22:27:38.143052329 +0000 UTC m=+265.965937105" observedRunningTime="2025-10-08 22:27:39.100973677 +0000 UTC m=+266.923858423" watchObservedRunningTime="2025-10-08 22:27:39.104236213 +0000 UTC m=+266.927120959" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.015562 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.018307 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.169546 
4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.178586 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.178615 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.233287 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.369337 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.369389 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.407006 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.583389 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-29sz4" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.584331 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-29sz4" Oct 08 22:27:44 crc kubenswrapper[4834]: I1008 22:27:44.627760 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-29sz4" Oct 08 22:27:45 crc kubenswrapper[4834]: I1008 22:27:45.039054 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-29sz4" Oct 08 22:27:45 crc kubenswrapper[4834]: I1008 22:27:45.049993 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:27:45 crc kubenswrapper[4834]: I1008 22:27:45.053175 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:27:45 crc kubenswrapper[4834]: I1008 22:27:45.053788 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:27:45 crc kubenswrapper[4834]: I1008 22:27:45.284343 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4v5sm"] Oct 08 22:27:45 crc kubenswrapper[4834]: I1008 22:27:45.978654 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c42p9"] Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.086422 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.086469 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.128868 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.579526 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29sz4"] Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.582092 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.582519 4834 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:27:46 crc kubenswrapper[4834]: I1008 22:27:46.626128 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.002245 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c42p9" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="registry-server" containerID="cri-o://19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a" gracePeriod=2 Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.048571 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.051124 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.221580 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.221626 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.270962 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.385009 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.460059 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzzd4\" (UniqueName: \"kubernetes.io/projected/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-kube-api-access-pzzd4\") pod \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.465792 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-kube-api-access-pzzd4" (OuterVolumeSpecName: "kube-api-access-pzzd4") pod "d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" (UID: "d19ec829-d35e-4acb-a5bb-8f2e81a67f4d"). InnerVolumeSpecName "kube-api-access-pzzd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.561061 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-catalog-content\") pod \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.561130 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-utilities\") pod \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\" (UID: \"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d\") " Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.561499 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzzd4\" (UniqueName: \"kubernetes.io/projected/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-kube-api-access-pzzd4\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.562356 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-utilities" (OuterVolumeSpecName: "utilities") pod "d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" (UID: "d19ec829-d35e-4acb-a5bb-8f2e81a67f4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.596717 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.596775 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.610845 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" (UID: "d19ec829-d35e-4acb-a5bb-8f2e81a67f4d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.655993 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.662866 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:47 crc kubenswrapper[4834]: I1008 22:27:47.662897 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.011977 4834 generic.go:334] "Generic (PLEG): container finished" podID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerID="19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a" exitCode=0 Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.012039 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c42p9" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.012118 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c42p9" event={"ID":"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d","Type":"ContainerDied","Data":"19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a"} Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.012169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c42p9" event={"ID":"d19ec829-d35e-4acb-a5bb-8f2e81a67f4d","Type":"ContainerDied","Data":"5cdf7979c6db74b949f781fb0c3fe9946e6a6b682d96d1886afe50de50dd0cdd"} Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.012191 4834 scope.go:117] "RemoveContainer" containerID="19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.013241 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-29sz4" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="registry-server" containerID="cri-o://4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d" gracePeriod=2 Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.034433 4834 scope.go:117] "RemoveContainer" containerID="dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.044435 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c42p9"] Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.051540 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c42p9"] Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.062564 4834 scope.go:117] "RemoveContainer" containerID="15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce" Oct 08 22:27:48 crc 
kubenswrapper[4834]: I1008 22:27:48.065370 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.074038 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.114983 4834 scope.go:117] "RemoveContainer" containerID="19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a" Oct 08 22:27:48 crc kubenswrapper[4834]: E1008 22:27:48.116700 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a\": container with ID starting with 19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a not found: ID does not exist" containerID="19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.116766 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a"} err="failed to get container status \"19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a\": rpc error: code = NotFound desc = could not find container \"19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a\": container with ID starting with 19e472b33d7ab5298e98d1ef17617ea3ab2d2f7c38b3b475e62c6928d1508a4a not found: ID does not exist" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.116806 4834 scope.go:117] "RemoveContainer" containerID="dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4" Oct 08 22:27:48 crc kubenswrapper[4834]: E1008 22:27:48.118382 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4\": container with ID starting with dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4 not found: ID does not exist" containerID="dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.118417 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4"} err="failed to get container status \"dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4\": rpc error: code = NotFound desc = could not find container \"dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4\": container with ID starting with dd7bdfc65c35456c3f538525249d2bea807d23892847ff30615f606905fe45a4 not found: ID does not exist" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.118460 4834 scope.go:117] "RemoveContainer" containerID="15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce" Oct 08 22:27:48 crc kubenswrapper[4834]: E1008 22:27:48.118981 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce\": container with ID starting with 15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce not found: ID does not exist" containerID="15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.119043 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce"} err="failed to get container status \"15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce\": rpc error: code = NotFound desc = could not find container \"15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce\": container with ID 
starting with 15242a93cdc3e69ba30e51580817d20c01c30a581b7ec90f8ef23a32ead5b5ce not found: ID does not exist" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.393379 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29sz4" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.579047 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krbzl\" (UniqueName: \"kubernetes.io/projected/ee84722f-b4c1-40dd-bc05-1df112e48a98-kube-api-access-krbzl\") pod \"ee84722f-b4c1-40dd-bc05-1df112e48a98\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.579256 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-catalog-content\") pod \"ee84722f-b4c1-40dd-bc05-1df112e48a98\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.579277 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-utilities\") pod \"ee84722f-b4c1-40dd-bc05-1df112e48a98\" (UID: \"ee84722f-b4c1-40dd-bc05-1df112e48a98\") " Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.580336 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-utilities" (OuterVolumeSpecName: "utilities") pod "ee84722f-b4c1-40dd-bc05-1df112e48a98" (UID: "ee84722f-b4c1-40dd-bc05-1df112e48a98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.588375 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee84722f-b4c1-40dd-bc05-1df112e48a98-kube-api-access-krbzl" (OuterVolumeSpecName: "kube-api-access-krbzl") pod "ee84722f-b4c1-40dd-bc05-1df112e48a98" (UID: "ee84722f-b4c1-40dd-bc05-1df112e48a98"). InnerVolumeSpecName "kube-api-access-krbzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.636138 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee84722f-b4c1-40dd-bc05-1df112e48a98" (UID: "ee84722f-b4c1-40dd-bc05-1df112e48a98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.681236 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.681273 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee84722f-b4c1-40dd-bc05-1df112e48a98-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.681287 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krbzl\" (UniqueName: \"kubernetes.io/projected/ee84722f-b4c1-40dd-bc05-1df112e48a98-kube-api-access-krbzl\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:48 crc kubenswrapper[4834]: I1008 22:27:48.976324 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bmkb"] Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 
22:27:49.018561 4834 generic.go:334] "Generic (PLEG): container finished" podID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerID="4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d" exitCode=0 Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.018644 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29sz4" event={"ID":"ee84722f-b4c1-40dd-bc05-1df112e48a98","Type":"ContainerDied","Data":"4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d"} Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.018679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29sz4" event={"ID":"ee84722f-b4c1-40dd-bc05-1df112e48a98","Type":"ContainerDied","Data":"666777364ded21529fb548c4862e4dc9d41ec94189385c3dc575f75da4b0b2a0"} Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.018679 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29sz4" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.018713 4834 scope.go:117] "RemoveContainer" containerID="4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.039881 4834 scope.go:117] "RemoveContainer" containerID="7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.060572 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29sz4"] Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.067295 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-29sz4"] Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.071330 4834 scope.go:117] "RemoveContainer" containerID="a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.085411 
4834 scope.go:117] "RemoveContainer" containerID="4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d" Oct 08 22:27:49 crc kubenswrapper[4834]: E1008 22:27:49.086711 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d\": container with ID starting with 4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d not found: ID does not exist" containerID="4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.086763 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d"} err="failed to get container status \"4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d\": rpc error: code = NotFound desc = could not find container \"4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d\": container with ID starting with 4e9a9d96690c225329edbf20d7b62a4e9f60585adb5ea7520b4fa3316f45b41d not found: ID does not exist" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.086792 4834 scope.go:117] "RemoveContainer" containerID="7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137" Oct 08 22:27:49 crc kubenswrapper[4834]: E1008 22:27:49.087431 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137\": container with ID starting with 7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137 not found: ID does not exist" containerID="7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.087489 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137"} err="failed to get container status \"7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137\": rpc error: code = NotFound desc = could not find container \"7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137\": container with ID starting with 7c78b16c844782b66d6915dc98da3a1aa34d722675e18ec4bb3a4134b0401137 not found: ID does not exist" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.087515 4834 scope.go:117] "RemoveContainer" containerID="a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c" Oct 08 22:27:49 crc kubenswrapper[4834]: E1008 22:27:49.087782 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c\": container with ID starting with a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c not found: ID does not exist" containerID="a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.087813 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c"} err="failed to get container status \"a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c\": rpc error: code = NotFound desc = could not find container \"a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c\": container with ID starting with a4df8c06f8e4ae8029011e5bf9c8bf6a385339e70b63a2c80412f5a57575fe5c not found: ID does not exist" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 22:27:49.562937 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" path="/var/lib/kubelet/pods/d19ec829-d35e-4acb-a5bb-8f2e81a67f4d/volumes" Oct 08 22:27:49 crc kubenswrapper[4834]: I1008 
22:27:49.563763 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" path="/var/lib/kubelet/pods/ee84722f-b4c1-40dd-bc05-1df112e48a98/volumes" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.030987 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9bmkb" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="registry-server" containerID="cri-o://81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17" gracePeriod=2 Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.425633 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.617230 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4brzp\" (UniqueName: \"kubernetes.io/projected/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-kube-api-access-4brzp\") pod \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.617352 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-utilities\") pod \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.617502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-catalog-content\") pod \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\" (UID: \"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53\") " Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.624288 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-utilities" (OuterVolumeSpecName: "utilities") pod "ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" (UID: "ca84cbfa-2a33-4adb-81b4-c6d80f0edd53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.625980 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-kube-api-access-4brzp" (OuterVolumeSpecName: "kube-api-access-4brzp") pod "ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" (UID: "ca84cbfa-2a33-4adb-81b4-c6d80f0edd53"). InnerVolumeSpecName "kube-api-access-4brzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.629401 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" (UID: "ca84cbfa-2a33-4adb-81b4-c6d80f0edd53"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.719012 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.719056 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:50 crc kubenswrapper[4834]: I1008 22:27:50.719070 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4brzp\" (UniqueName: \"kubernetes.io/projected/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53-kube-api-access-4brzp\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.037807 4834 generic.go:334] "Generic (PLEG): container finished" podID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerID="81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17" exitCode=0 Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.037853 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerDied","Data":"81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17"} Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.037886 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bmkb" event={"ID":"ca84cbfa-2a33-4adb-81b4-c6d80f0edd53","Type":"ContainerDied","Data":"cadd03fe17a6a6292927b0b9604fecd06324905c7289dd59d68891170657aea1"} Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.037912 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bmkb" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.037974 4834 scope.go:117] "RemoveContainer" containerID="81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.054171 4834 scope.go:117] "RemoveContainer" containerID="71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.065012 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bmkb"] Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.067492 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bmkb"] Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.086967 4834 scope.go:117] "RemoveContainer" containerID="efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.098315 4834 scope.go:117] "RemoveContainer" containerID="81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17" Oct 08 22:27:51 crc kubenswrapper[4834]: E1008 22:27:51.098979 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17\": container with ID starting with 81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17 not found: ID does not exist" containerID="81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.099030 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17"} err="failed to get container status \"81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17\": rpc error: code = NotFound desc = could not find container 
\"81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17\": container with ID starting with 81dcbd6101275e747897becca7ddf5fba8848a0d2d00607f90ddae5357f97b17 not found: ID does not exist" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.099062 4834 scope.go:117] "RemoveContainer" containerID="71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27" Oct 08 22:27:51 crc kubenswrapper[4834]: E1008 22:27:51.099536 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27\": container with ID starting with 71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27 not found: ID does not exist" containerID="71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.099559 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27"} err="failed to get container status \"71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27\": rpc error: code = NotFound desc = could not find container \"71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27\": container with ID starting with 71e788ab33ce0ce1bf4547f6388c6d05e948bbb5ce113bda8727c565f8e82a27 not found: ID does not exist" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.099580 4834 scope.go:117] "RemoveContainer" containerID="efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c" Oct 08 22:27:51 crc kubenswrapper[4834]: E1008 22:27:51.099853 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c\": container with ID starting with efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c not found: ID does not exist" 
containerID="efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.099888 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c"} err="failed to get container status \"efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c\": rpc error: code = NotFound desc = could not find container \"efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c\": container with ID starting with efad30f15755ee0358c78d1e27d16c3b286b57c61937d3f08cdef90b89d1cc2c not found: ID does not exist" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.375811 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8nbhp"] Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.376087 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8nbhp" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="registry-server" containerID="cri-o://b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b" gracePeriod=2 Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.571331 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" path="/var/lib/kubelet/pods/ca84cbfa-2a33-4adb-81b4-c6d80f0edd53/volumes" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.745966 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.934210 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-catalog-content\") pod \"78932c45-91a2-46b0-9b9d-73f4c14d2706\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.934314 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-utilities\") pod \"78932c45-91a2-46b0-9b9d-73f4c14d2706\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.934350 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d78fw\" (UniqueName: \"kubernetes.io/projected/78932c45-91a2-46b0-9b9d-73f4c14d2706-kube-api-access-d78fw\") pod \"78932c45-91a2-46b0-9b9d-73f4c14d2706\" (UID: \"78932c45-91a2-46b0-9b9d-73f4c14d2706\") " Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.935217 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-utilities" (OuterVolumeSpecName: "utilities") pod "78932c45-91a2-46b0-9b9d-73f4c14d2706" (UID: "78932c45-91a2-46b0-9b9d-73f4c14d2706"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:51 crc kubenswrapper[4834]: I1008 22:27:51.941294 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78932c45-91a2-46b0-9b9d-73f4c14d2706-kube-api-access-d78fw" (OuterVolumeSpecName: "kube-api-access-d78fw") pod "78932c45-91a2-46b0-9b9d-73f4c14d2706" (UID: "78932c45-91a2-46b0-9b9d-73f4c14d2706"). InnerVolumeSpecName "kube-api-access-d78fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.016533 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78932c45-91a2-46b0-9b9d-73f4c14d2706" (UID: "78932c45-91a2-46b0-9b9d-73f4c14d2706"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.035181 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.035216 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78932c45-91a2-46b0-9b9d-73f4c14d2706-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.035227 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d78fw\" (UniqueName: \"kubernetes.io/projected/78932c45-91a2-46b0-9b9d-73f4c14d2706-kube-api-access-d78fw\") on node \"crc\" DevicePath \"\"" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.047420 4834 generic.go:334] "Generic (PLEG): container finished" podID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerID="b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b" exitCode=0 Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.047506 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8nbhp" event={"ID":"78932c45-91a2-46b0-9b9d-73f4c14d2706","Type":"ContainerDied","Data":"b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b"} Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.047760 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-8nbhp" event={"ID":"78932c45-91a2-46b0-9b9d-73f4c14d2706","Type":"ContainerDied","Data":"04843fc2450d652ccaf4271ba73dcd872adf5c1c3feb344afffa713ded265616"} Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.047785 4834 scope.go:117] "RemoveContainer" containerID="b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.047516 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8nbhp" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.078879 4834 scope.go:117] "RemoveContainer" containerID="f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.084532 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8nbhp"] Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.085188 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8nbhp"] Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.106681 4834 scope.go:117] "RemoveContainer" containerID="76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.125551 4834 scope.go:117] "RemoveContainer" containerID="b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b" Oct 08 22:27:52 crc kubenswrapper[4834]: E1008 22:27:52.126023 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b\": container with ID starting with b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b not found: ID does not exist" containerID="b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.126065 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b"} err="failed to get container status \"b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b\": rpc error: code = NotFound desc = could not find container \"b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b\": container with ID starting with b39361084f22b62b1c8a9ca262fdab52faef6d9512aed3ca181c61688c42358b not found: ID does not exist" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.126098 4834 scope.go:117] "RemoveContainer" containerID="f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0" Oct 08 22:27:52 crc kubenswrapper[4834]: E1008 22:27:52.126548 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0\": container with ID starting with f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0 not found: ID does not exist" containerID="f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.126591 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0"} err="failed to get container status \"f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0\": rpc error: code = NotFound desc = could not find container \"f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0\": container with ID starting with f9941a0864448e60b31497b55e69c7f6c3b9dd1a1f9f57eaf43cc1c36d3c96b0 not found: ID does not exist" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.126625 4834 scope.go:117] "RemoveContainer" containerID="76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595" Oct 08 22:27:52 crc kubenswrapper[4834]: E1008 
22:27:52.126955 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595\": container with ID starting with 76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595 not found: ID does not exist" containerID="76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595" Oct 08 22:27:52 crc kubenswrapper[4834]: I1008 22:27:52.126986 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595"} err="failed to get container status \"76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595\": rpc error: code = NotFound desc = could not find container \"76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595\": container with ID starting with 76814a942d952b1b9751b93b76b4adefa891359bcc18c26ab2bbe7646bd13595 not found: ID does not exist" Oct 08 22:27:53 crc kubenswrapper[4834]: I1008 22:27:53.568792 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" path="/var/lib/kubelet/pods/78932c45-91a2-46b0-9b9d-73f4c14d2706/volumes" Oct 08 22:28:10 crc kubenswrapper[4834]: I1008 22:28:10.325874 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" containerID="cri-o://b37624f0757bb446b6a7c569534fd9c58da95bfe6b70e3b620dcfb64bb9a69ce" gracePeriod=15 Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.195007 4834 generic.go:334] "Generic (PLEG): container finished" podID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerID="b37624f0757bb446b6a7c569534fd9c58da95bfe6b70e3b620dcfb64bb9a69ce" exitCode=0 Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.195079 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" event={"ID":"5496ae0a-5098-49eb-9a39-82e4d0c584bf","Type":"ContainerDied","Data":"b37624f0757bb446b6a7c569534fd9c58da95bfe6b70e3b620dcfb64bb9a69ce"} Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.355399 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410195 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79d99db75d-4qk6m"] Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410568 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06172a77-1bd6-447d-9276-c4bfd79efba9" containerName="pruner" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410593 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="06172a77-1bd6-447d-9276-c4bfd79efba9" containerName="pruner" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410611 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410624 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410643 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410656 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410679 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" 
containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410691 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410707 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410719 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410735 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410748 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410765 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410777 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410796 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410811 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410830 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" 
containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410842 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410859 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410871 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410887 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410899 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" containerName="extract-content" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410913 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410925 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410941 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410956 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="extract-utilities" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410972 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c302d50-f83a-448d-a914-905ec04ada98" 
containerName="collect-profiles" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.410985 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c302d50-f83a-448d-a914-905ec04ada98" containerName="collect-profiles" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.410997 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a78eee7-8970-436a-94e9-985bbb0bee86" containerName="pruner" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.411010 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a78eee7-8970-436a-94e9-985bbb0bee86" containerName="pruner" Oct 08 22:28:11 crc kubenswrapper[4834]: E1008 22:28:11.411027 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.411039 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420651 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca84cbfa-2a33-4adb-81b4-c6d80f0edd53" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420711 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c302d50-f83a-448d-a914-905ec04ada98" containerName="collect-profiles" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420740 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="06172a77-1bd6-447d-9276-c4bfd79efba9" containerName="pruner" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420759 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="78932c45-91a2-46b0-9b9d-73f4c14d2706" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420784 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee84722f-b4c1-40dd-bc05-1df112e48a98" 
containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420806 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19ec829-d35e-4acb-a5bb-8f2e81a67f4d" containerName="registry-server" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420828 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" containerName="oauth-openshift" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.420845 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a78eee7-8970-436a-94e9-985bbb0bee86" containerName="pruner" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.421689 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.426264 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d99db75d-4qk6m"] Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.482470 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7lf9\" (UniqueName: \"kubernetes.io/projected/5496ae0a-5098-49eb-9a39-82e4d0c584bf-kube-api-access-h7lf9\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.482959 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-dir\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483016 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-trusted-ca-bundle\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483061 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-policies\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483107 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-service-ca\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483160 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-error\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483208 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-session\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483257 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-router-certs\") pod 
\"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-ocp-branding-template\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483308 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-login\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483383 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-cliconfig\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483424 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-provider-selection\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483509 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-idp-0-file-data\") pod 
\"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483537 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-serving-cert\") pod \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\" (UID: \"5496ae0a-5098-49eb-9a39-82e4d0c584bf\") " Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483779 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b11bd3c-9f9f-4401-82b1-94a5362b2361-audit-dir\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483849 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-session\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483842 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483882 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483930 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483960 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483955 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-policies" 
(OuterVolumeSpecName: "audit-policies") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484001 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484032 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j7sw\" (UniqueName: \"kubernetes.io/projected/6b11bd3c-9f9f-4401-82b1-94a5362b2361-kube-api-access-4j7sw\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484060 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484103 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-audit-policies\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " 
pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484133 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484179 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-login\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484210 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-error\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484242 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484300 4834 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.484318 4834 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.483117 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.485266 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.486423 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.491134 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.493603 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.494001 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.499367 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5496ae0a-5098-49eb-9a39-82e4d0c584bf-kube-api-access-h7lf9" (OuterVolumeSpecName: "kube-api-access-h7lf9") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "kube-api-access-h7lf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.499389 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.499774 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.499871 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.500104 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.501397 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5496ae0a-5098-49eb-9a39-82e4d0c584bf" (UID: "5496ae0a-5098-49eb-9a39-82e4d0c584bf"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.585871 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.585948 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j7sw\" (UniqueName: \"kubernetes.io/projected/6b11bd3c-9f9f-4401-82b1-94a5362b2361-kube-api-access-4j7sw\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.585983 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586035 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-audit-policies\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586068 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586099 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-login\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586134 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-error\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586189 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: 
\"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586238 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586277 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b11bd3c-9f9f-4401-82b1-94a5362b2361-audit-dir\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586303 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-session\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586337 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586382 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586414 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586480 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586500 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586518 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586534 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: 
I1008 22:28:11.586550 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586565 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586597 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586617 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586639 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586659 4834 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5496ae0a-5098-49eb-9a39-82e4d0c584bf-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586676 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7lf9\" (UniqueName: 
\"kubernetes.io/projected/5496ae0a-5098-49eb-9a39-82e4d0c584bf-kube-api-access-h7lf9\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.586691 4834 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5496ae0a-5098-49eb-9a39-82e4d0c584bf-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.587765 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.588474 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6b11bd3c-9f9f-4401-82b1-94a5362b2361-audit-dir\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.589355 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-service-ca\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.590501 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: 
\"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.590960 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.592017 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6b11bd3c-9f9f-4401-82b1-94a5362b2361-audit-policies\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.593943 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-router-certs\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.594665 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-login\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.596354 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.596881 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-session\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.597313 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.597560 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.599258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6b11bd3c-9f9f-4401-82b1-94a5362b2361-v4-0-config-user-template-error\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " 
pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.607378 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j7sw\" (UniqueName: \"kubernetes.io/projected/6b11bd3c-9f9f-4401-82b1-94a5362b2361-kube-api-access-4j7sw\") pod \"oauth-openshift-79d99db75d-4qk6m\" (UID: \"6b11bd3c-9f9f-4401-82b1-94a5362b2361\") " pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:11 crc kubenswrapper[4834]: I1008 22:28:11.744888 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.049075 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79d99db75d-4qk6m"] Oct 08 22:28:12 crc kubenswrapper[4834]: W1008 22:28:12.058469 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b11bd3c_9f9f_4401_82b1_94a5362b2361.slice/crio-2cbbb605cdb70b9a4b43ffb62f24e7a2947a6bcfdcc29273af962232b47e3db1 WatchSource:0}: Error finding container 2cbbb605cdb70b9a4b43ffb62f24e7a2947a6bcfdcc29273af962232b47e3db1: Status 404 returned error can't find the container with id 2cbbb605cdb70b9a4b43ffb62f24e7a2947a6bcfdcc29273af962232b47e3db1 Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.207872 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" event={"ID":"5496ae0a-5098-49eb-9a39-82e4d0c584bf","Type":"ContainerDied","Data":"97f4d510d7a868e9a6aaecce0aab4f2d0911cda716a7ae4c013a4fd38a349287"} Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.207960 4834 scope.go:117] "RemoveContainer" containerID="b37624f0757bb446b6a7c569534fd9c58da95bfe6b70e3b620dcfb64bb9a69ce" Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.208038 4834 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4v5sm" Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.210562 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" event={"ID":"6b11bd3c-9f9f-4401-82b1-94a5362b2361","Type":"ContainerStarted","Data":"2cbbb605cdb70b9a4b43ffb62f24e7a2947a6bcfdcc29273af962232b47e3db1"} Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.249510 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4v5sm"] Oct 08 22:28:12 crc kubenswrapper[4834]: I1008 22:28:12.249630 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4v5sm"] Oct 08 22:28:13 crc kubenswrapper[4834]: I1008 22:28:13.223472 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" event={"ID":"6b11bd3c-9f9f-4401-82b1-94a5362b2361","Type":"ContainerStarted","Data":"c964446267d346bd0ef0fccbee55eb9421522beaff73dd6b1f7606584d19c4d0"} Oct 08 22:28:13 crc kubenswrapper[4834]: I1008 22:28:13.223956 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:13 crc kubenswrapper[4834]: I1008 22:28:13.231962 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" Oct 08 22:28:13 crc kubenswrapper[4834]: I1008 22:28:13.260920 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79d99db75d-4qk6m" podStartSLOduration=28.26088972 podStartE2EDuration="28.26088972s" podCreationTimestamp="2025-10-08 22:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 22:28:13.255838016 +0000 UTC m=+301.078722802" watchObservedRunningTime="2025-10-08 22:28:13.26088972 +0000 UTC m=+301.083774506" Oct 08 22:28:13 crc kubenswrapper[4834]: I1008 22:28:13.573573 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5496ae0a-5098-49eb-9a39-82e4d0c584bf" path="/var/lib/kubelet/pods/5496ae0a-5098-49eb-9a39-82e4d0c584bf/volumes" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.090777 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ct27"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.091848 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9ct27" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="registry-server" containerID="cri-o://050036223fe323ed90bc5e76e7189f5c0fc9ca230806d62def3b1fe6f30ebc8b" gracePeriod=30 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.105839 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hfv2p"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.106200 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hfv2p" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="registry-server" containerID="cri-o://8d69c48824e0cd79ba851f88fcc5eb0cfff204f6528be1fd70693e8fa727923a" gracePeriod=30 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.114072 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xsxk5"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.114423 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" podUID="52cf1dbe-b7f9-44be-bc44-1308a5eb0471" containerName="marketplace-operator" 
containerID="cri-o://7d5966b68c73f70620059e69b283bf0edd5c22f4ef23211cb6e273c130a674da" gracePeriod=30 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.126192 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg2s9"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.126767 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sg2s9" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="registry-server" containerID="cri-o://67a2b1ffac9f3996d48fe063037e6f51aaf3df02ff8e247625b8134f3274cd65" gracePeriod=30 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.134606 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnhss"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.134966 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nnhss" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="registry-server" containerID="cri-o://f20e17609487f88325b1e8856c7211535863fdb023bc199f1a77914eadc02204" gracePeriod=30 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.141594 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kvxj2"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.142614 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.147389 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kvxj2"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.279757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/209dd6f3-8823-4e04-8e83-100706400bc8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.279865 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk4pf\" (UniqueName: \"kubernetes.io/projected/209dd6f3-8823-4e04-8e83-100706400bc8-kube-api-access-gk4pf\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.279939 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/209dd6f3-8823-4e04-8e83-100706400bc8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.363955 4834 generic.go:334] "Generic (PLEG): container finished" podID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerID="8d69c48824e0cd79ba851f88fcc5eb0cfff204f6528be1fd70693e8fa727923a" exitCode=0 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.364031 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-hfv2p" event={"ID":"2db23b8a-4d46-47bb-8ed4-ad6747401463","Type":"ContainerDied","Data":"8d69c48824e0cd79ba851f88fcc5eb0cfff204f6528be1fd70693e8fa727923a"} Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.368950 4834 generic.go:334] "Generic (PLEG): container finished" podID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerID="f20e17609487f88325b1e8856c7211535863fdb023bc199f1a77914eadc02204" exitCode=0 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.369033 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerDied","Data":"f20e17609487f88325b1e8856c7211535863fdb023bc199f1a77914eadc02204"} Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.381389 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk4pf\" (UniqueName: \"kubernetes.io/projected/209dd6f3-8823-4e04-8e83-100706400bc8-kube-api-access-gk4pf\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.381450 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/209dd6f3-8823-4e04-8e83-100706400bc8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.381483 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/209dd6f3-8823-4e04-8e83-100706400bc8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: 
\"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.383851 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/209dd6f3-8823-4e04-8e83-100706400bc8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.390020 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/209dd6f3-8823-4e04-8e83-100706400bc8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.390939 4834 generic.go:334] "Generic (PLEG): container finished" podID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerID="67a2b1ffac9f3996d48fe063037e6f51aaf3df02ff8e247625b8134f3274cd65" exitCode=0 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.391044 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg2s9" event={"ID":"e106fe24-6e8b-4afc-be9f-19de99e7bb9b","Type":"ContainerDied","Data":"67a2b1ffac9f3996d48fe063037e6f51aaf3df02ff8e247625b8134f3274cd65"} Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.403359 4834 generic.go:334] "Generic (PLEG): container finished" podID="edebf05e-6b29-4d72-9805-66328cac3d49" containerID="050036223fe323ed90bc5e76e7189f5c0fc9ca230806d62def3b1fe6f30ebc8b" exitCode=0 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.403528 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" 
event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerDied","Data":"050036223fe323ed90bc5e76e7189f5c0fc9ca230806d62def3b1fe6f30ebc8b"} Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.405654 4834 generic.go:334] "Generic (PLEG): container finished" podID="52cf1dbe-b7f9-44be-bc44-1308a5eb0471" containerID="7d5966b68c73f70620059e69b283bf0edd5c22f4ef23211cb6e273c130a674da" exitCode=0 Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.405717 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" event={"ID":"52cf1dbe-b7f9-44be-bc44-1308a5eb0471","Type":"ContainerDied","Data":"7d5966b68c73f70620059e69b283bf0edd5c22f4ef23211cb6e273c130a674da"} Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.411040 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk4pf\" (UniqueName: \"kubernetes.io/projected/209dd6f3-8823-4e04-8e83-100706400bc8-kube-api-access-gk4pf\") pod \"marketplace-operator-79b997595-kvxj2\" (UID: \"209dd6f3-8823-4e04-8e83-100706400bc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.467698 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.598484 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.600959 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.611525 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.612113 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.623426 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.688319 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxb65\" (UniqueName: \"kubernetes.io/projected/2db23b8a-4d46-47bb-8ed4-ad6747401463-kube-api-access-kxb65\") pod \"2db23b8a-4d46-47bb-8ed4-ad6747401463\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.688401 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-utilities\") pod \"2db23b8a-4d46-47bb-8ed4-ad6747401463\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.690206 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-utilities\") pod \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.690339 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-utilities" (OuterVolumeSpecName: "utilities") pod "2db23b8a-4d46-47bb-8ed4-ad6747401463" (UID: "2db23b8a-4d46-47bb-8ed4-ad6747401463"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.690413 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-catalog-content\") pod \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.690493 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fnd8\" (UniqueName: \"kubernetes.io/projected/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-kube-api-access-9fnd8\") pod \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\" (UID: \"e106fe24-6e8b-4afc-be9f-19de99e7bb9b\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.693967 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-catalog-content\") pod \"2db23b8a-4d46-47bb-8ed4-ad6747401463\" (UID: \"2db23b8a-4d46-47bb-8ed4-ad6747401463\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.692617 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-utilities" (OuterVolumeSpecName: "utilities") pod "e106fe24-6e8b-4afc-be9f-19de99e7bb9b" (UID: "e106fe24-6e8b-4afc-be9f-19de99e7bb9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.699489 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db23b8a-4d46-47bb-8ed4-ad6747401463-kube-api-access-kxb65" (OuterVolumeSpecName: "kube-api-access-kxb65") pod "2db23b8a-4d46-47bb-8ed4-ad6747401463" (UID: "2db23b8a-4d46-47bb-8ed4-ad6747401463"). InnerVolumeSpecName "kube-api-access-kxb65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.702694 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxb65\" (UniqueName: \"kubernetes.io/projected/2db23b8a-4d46-47bb-8ed4-ad6747401463-kube-api-access-kxb65\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.702745 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.702760 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.708932 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e106fe24-6e8b-4afc-be9f-19de99e7bb9b" (UID: "e106fe24-6e8b-4afc-be9f-19de99e7bb9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.716003 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-kube-api-access-9fnd8" (OuterVolumeSpecName: "kube-api-access-9fnd8") pod "e106fe24-6e8b-4afc-be9f-19de99e7bb9b" (UID: "e106fe24-6e8b-4afc-be9f-19de99e7bb9b"). InnerVolumeSpecName "kube-api-access-9fnd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.793232 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kvxj2"] Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804635 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtkg4\" (UniqueName: \"kubernetes.io/projected/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-kube-api-access-dtkg4\") pod \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804696 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pcg8\" (UniqueName: \"kubernetes.io/projected/edebf05e-6b29-4d72-9805-66328cac3d49-kube-api-access-6pcg8\") pod \"edebf05e-6b29-4d72-9805-66328cac3d49\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-trusted-ca\") pod \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804753 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-utilities\") pod \"edebf05e-6b29-4d72-9805-66328cac3d49\" (UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804858 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-catalog-content\") pod \"edebf05e-6b29-4d72-9805-66328cac3d49\" 
(UID: \"edebf05e-6b29-4d72-9805-66328cac3d49\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804905 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpj9w\" (UniqueName: \"kubernetes.io/projected/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-kube-api-access-wpj9w\") pod \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804928 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-utilities\") pod \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.804962 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-catalog-content\") pod \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\" (UID: \"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.805026 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-operator-metrics\") pod \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\" (UID: \"52cf1dbe-b7f9-44be-bc44-1308a5eb0471\") " Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.805281 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.805293 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fnd8\" (UniqueName: 
\"kubernetes.io/projected/e106fe24-6e8b-4afc-be9f-19de99e7bb9b-kube-api-access-9fnd8\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.805303 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db23b8a-4d46-47bb-8ed4-ad6747401463" (UID: "2db23b8a-4d46-47bb-8ed4-ad6747401463"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.807410 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-utilities" (OuterVolumeSpecName: "utilities") pod "edebf05e-6b29-4d72-9805-66328cac3d49" (UID: "edebf05e-6b29-4d72-9805-66328cac3d49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.810717 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "52cf1dbe-b7f9-44be-bc44-1308a5eb0471" (UID: "52cf1dbe-b7f9-44be-bc44-1308a5eb0471"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.811090 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-utilities" (OuterVolumeSpecName: "utilities") pod "ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" (UID: "ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.812333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-kube-api-access-wpj9w" (OuterVolumeSpecName: "kube-api-access-wpj9w") pod "ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" (UID: "ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca"). InnerVolumeSpecName "kube-api-access-wpj9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.813585 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "52cf1dbe-b7f9-44be-bc44-1308a5eb0471" (UID: "52cf1dbe-b7f9-44be-bc44-1308a5eb0471"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.815589 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edebf05e-6b29-4d72-9805-66328cac3d49-kube-api-access-6pcg8" (OuterVolumeSpecName: "kube-api-access-6pcg8") pod "edebf05e-6b29-4d72-9805-66328cac3d49" (UID: "edebf05e-6b29-4d72-9805-66328cac3d49"). InnerVolumeSpecName "kube-api-access-6pcg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.816370 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-kube-api-access-dtkg4" (OuterVolumeSpecName: "kube-api-access-dtkg4") pod "52cf1dbe-b7f9-44be-bc44-1308a5eb0471" (UID: "52cf1dbe-b7f9-44be-bc44-1308a5eb0471"). InnerVolumeSpecName "kube-api-access-dtkg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.900007 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" (UID: "ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906680 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpj9w\" (UniqueName: \"kubernetes.io/projected/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-kube-api-access-wpj9w\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906710 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906722 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906731 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906740 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtkg4\" (UniqueName: \"kubernetes.io/projected/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-kube-api-access-dtkg4\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906749 4834 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-6pcg8\" (UniqueName: \"kubernetes.io/projected/edebf05e-6b29-4d72-9805-66328cac3d49-kube-api-access-6pcg8\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906757 4834 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52cf1dbe-b7f9-44be-bc44-1308a5eb0471-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906767 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.906778 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db23b8a-4d46-47bb-8ed4-ad6747401463-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:33 crc kubenswrapper[4834]: I1008 22:28:33.911637 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edebf05e-6b29-4d72-9805-66328cac3d49" (UID: "edebf05e-6b29-4d72-9805-66328cac3d49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.007969 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edebf05e-6b29-4d72-9805-66328cac3d49-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.413913 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" event={"ID":"209dd6f3-8823-4e04-8e83-100706400bc8","Type":"ContainerStarted","Data":"0dfbaeb4ea19a7289d8282d12a748fbbecde369234cc58bbe8094babfa43b9b5"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.413990 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" event={"ID":"209dd6f3-8823-4e04-8e83-100706400bc8","Type":"ContainerStarted","Data":"619fedd6144e34c5c55474bbbabc3f6cb0aedac8d84cc330a32825bd23ff828f"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.414546 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.418116 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.419499 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nnhss" event={"ID":"ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca","Type":"ContainerDied","Data":"52b561330621f4e787efc0b7047b42b477a53109a9782097b61371c3efa56f84"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.419573 4834 scope.go:117] "RemoveContainer" containerID="f20e17609487f88325b1e8856c7211535863fdb023bc199f1a77914eadc02204" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.419587 4834 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nnhss" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.424846 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sg2s9" event={"ID":"e106fe24-6e8b-4afc-be9f-19de99e7bb9b","Type":"ContainerDied","Data":"d285dc12d3d0264c4fd821e3a39521fd200a4157b8cff4cf22037d45ecc0f07f"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.425209 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sg2s9" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.427883 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9ct27" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.428180 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9ct27" event={"ID":"edebf05e-6b29-4d72-9805-66328cac3d49","Type":"ContainerDied","Data":"411b58f88129e2a07e63223431d3333ae8abc74abbc78d4b0b5c0b348bb72bc9"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.429897 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" event={"ID":"52cf1dbe-b7f9-44be-bc44-1308a5eb0471","Type":"ContainerDied","Data":"811698e2b7c0e80996357578420c2866beeccdf11593e2fdc798ee40d6756a7d"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.430032 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xsxk5" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.434346 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kvxj2" podStartSLOduration=1.434327642 podStartE2EDuration="1.434327642s" podCreationTimestamp="2025-10-08 22:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:28:34.43160474 +0000 UTC m=+322.254489486" watchObservedRunningTime="2025-10-08 22:28:34.434327642 +0000 UTC m=+322.257212388" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.447431 4834 scope.go:117] "RemoveContainer" containerID="3c37b6ba6d15794ed307c666b54c96c33853476cc067c3444d9a44df0bc4c8a3" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.447603 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hfv2p" event={"ID":"2db23b8a-4d46-47bb-8ed4-ad6747401463","Type":"ContainerDied","Data":"2f1bca872d66dacd4e0c589e7f6f0f9a30402ad3a74447768ce7627b5be169f8"} Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.447734 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hfv2p" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.483178 4834 scope.go:117] "RemoveContainer" containerID="8aa254da0edb162d281a049b8cc5c88a7ab2d2e46ee3ca666bb74cc3cfebeeb9" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.500395 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nnhss"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.504981 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nnhss"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.518301 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xsxk5"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.522541 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xsxk5"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.533234 4834 scope.go:117] "RemoveContainer" containerID="67a2b1ffac9f3996d48fe063037e6f51aaf3df02ff8e247625b8134f3274cd65" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.541186 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg2s9"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.551342 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sg2s9"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.554342 4834 scope.go:117] "RemoveContainer" containerID="68168a9814c4c4d210aba1b6d85f1b9dfebb16cf9f68b8f1edcceff1afa48a30" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.555643 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hfv2p"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.565391 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-hfv2p"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.569483 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9ct27"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.585333 4834 scope.go:117] "RemoveContainer" containerID="a33f2f4c3b4778d0b0e07fefbb6b1e12588f34bd8c7cca990576326084e90e9c" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.585479 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9ct27"] Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.602435 4834 scope.go:117] "RemoveContainer" containerID="050036223fe323ed90bc5e76e7189f5c0fc9ca230806d62def3b1fe6f30ebc8b" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.620133 4834 scope.go:117] "RemoveContainer" containerID="f32228ca4ef90d0ada4e141ee9d7e9f8e0d5823a5841a6de15832b63e88855eb" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.634665 4834 scope.go:117] "RemoveContainer" containerID="0aaeed775a847f54b20a5c164d8a5aada090cf80ce272f9edcd74ef77a5379b0" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.647610 4834 scope.go:117] "RemoveContainer" containerID="7d5966b68c73f70620059e69b283bf0edd5c22f4ef23211cb6e273c130a674da" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.661359 4834 scope.go:117] "RemoveContainer" containerID="8d69c48824e0cd79ba851f88fcc5eb0cfff204f6528be1fd70693e8fa727923a" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.674785 4834 scope.go:117] "RemoveContainer" containerID="48864183c9bb0f95d315471adde9c7609ec2bbba9e9292387ea66c7ec5c5cc16" Oct 08 22:28:34 crc kubenswrapper[4834]: I1008 22:28:34.689451 4834 scope.go:117] "RemoveContainer" containerID="157676f7b443e0637a234fa3eadb56a071012dc7292d28101b32adecf02c979e" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.311579 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-q6hqr"] Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312385 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312434 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312449 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312459 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312477 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cf1dbe-b7f9-44be-bc44-1308a5eb0471" containerName="marketplace-operator" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312485 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cf1dbe-b7f9-44be-bc44-1308a5eb0471" containerName="marketplace-operator" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312499 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312507 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312518 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312527 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312539 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312547 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312559 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312566 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312575 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312583 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312592 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312600 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="extract-utilities" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312617 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312625 4834 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312634 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312641 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312652 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312658 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="extract-content" Oct 08 22:28:35 crc kubenswrapper[4834]: E1008 22:28:35.312670 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312678 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312808 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312833 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312843 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312857 4834 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="52cf1dbe-b7f9-44be-bc44-1308a5eb0471" containerName="marketplace-operator" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.312868 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" containerName="registry-server" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.313934 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.316196 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.327327 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6hqr"] Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.431381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cb81c-17fd-4e22-8098-ea576e358559-utilities\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.431440 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j57s\" (UniqueName: \"kubernetes.io/projected/057cb81c-17fd-4e22-8098-ea576e358559-kube-api-access-6j57s\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.431500 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cb81c-17fd-4e22-8098-ea576e358559-catalog-content\") pod \"redhat-marketplace-q6hqr\" 
(UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.510357 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sg4bs"] Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.511962 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.514700 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.519131 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sg4bs"] Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.536931 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cb81c-17fd-4e22-8098-ea576e358559-utilities\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.537018 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkd54\" (UniqueName: \"kubernetes.io/projected/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-kube-api-access-rkd54\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.537073 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j57s\" (UniqueName: \"kubernetes.io/projected/057cb81c-17fd-4e22-8098-ea576e358559-kube-api-access-6j57s\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " 
pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.537116 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-utilities\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.537246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cb81c-17fd-4e22-8098-ea576e358559-catalog-content\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.537365 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-catalog-content\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.539951 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/057cb81c-17fd-4e22-8098-ea576e358559-utilities\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.542984 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/057cb81c-17fd-4e22-8098-ea576e358559-catalog-content\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " 
pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.562740 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db23b8a-4d46-47bb-8ed4-ad6747401463" path="/var/lib/kubelet/pods/2db23b8a-4d46-47bb-8ed4-ad6747401463/volumes" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.563444 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cf1dbe-b7f9-44be-bc44-1308a5eb0471" path="/var/lib/kubelet/pods/52cf1dbe-b7f9-44be-bc44-1308a5eb0471/volumes" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.563922 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca" path="/var/lib/kubelet/pods/ac0f7f20-65ec-45a1-94f9-9ab9b86ba6ca/volumes" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.565083 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e106fe24-6e8b-4afc-be9f-19de99e7bb9b" path="/var/lib/kubelet/pods/e106fe24-6e8b-4afc-be9f-19de99e7bb9b/volumes" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.565652 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j57s\" (UniqueName: \"kubernetes.io/projected/057cb81c-17fd-4e22-8098-ea576e358559-kube-api-access-6j57s\") pod \"redhat-marketplace-q6hqr\" (UID: \"057cb81c-17fd-4e22-8098-ea576e358559\") " pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.565906 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edebf05e-6b29-4d72-9805-66328cac3d49" path="/var/lib/kubelet/pods/edebf05e-6b29-4d72-9805-66328cac3d49/volumes" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.638462 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-catalog-content\") pod \"redhat-operators-sg4bs\" (UID: 
\"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.638558 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkd54\" (UniqueName: \"kubernetes.io/projected/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-kube-api-access-rkd54\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.638600 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-utilities\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.639786 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-catalog-content\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.640666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-utilities\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.658468 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkd54\" (UniqueName: \"kubernetes.io/projected/3482aeed-66bd-4fe3-81d6-c12cdab7f9d9-kube-api-access-rkd54\") pod \"redhat-operators-sg4bs\" (UID: \"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9\") " 
pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.672203 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.840156 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:35 crc kubenswrapper[4834]: I1008 22:28:35.863008 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q6hqr"] Oct 08 22:28:35 crc kubenswrapper[4834]: W1008 22:28:35.866877 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057cb81c_17fd_4e22_8098_ea576e358559.slice/crio-fcd43a3c92ba93c551f215dacddf21636491ea7756207ad561fa8d9ecfad6e41 WatchSource:0}: Error finding container fcd43a3c92ba93c551f215dacddf21636491ea7756207ad561fa8d9ecfad6e41: Status 404 returned error can't find the container with id fcd43a3c92ba93c551f215dacddf21636491ea7756207ad561fa8d9ecfad6e41 Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.035581 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sg4bs"] Oct 08 22:28:36 crc kubenswrapper[4834]: W1008 22:28:36.101704 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3482aeed_66bd_4fe3_81d6_c12cdab7f9d9.slice/crio-1898b2d9a91fc848f03601d5d1d08c940f17814e30bdfea68fe23de6cabbc35b WatchSource:0}: Error finding container 1898b2d9a91fc848f03601d5d1d08c940f17814e30bdfea68fe23de6cabbc35b: Status 404 returned error can't find the container with id 1898b2d9a91fc848f03601d5d1d08c940f17814e30bdfea68fe23de6cabbc35b Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.467838 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="057cb81c-17fd-4e22-8098-ea576e358559" containerID="a6835eea0969ba13da0da099125ca130dd36e4f98a132722fce76e829cf5d0fe" exitCode=0 Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.467907 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6hqr" event={"ID":"057cb81c-17fd-4e22-8098-ea576e358559","Type":"ContainerDied","Data":"a6835eea0969ba13da0da099125ca130dd36e4f98a132722fce76e829cf5d0fe"} Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.469043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6hqr" event={"ID":"057cb81c-17fd-4e22-8098-ea576e358559","Type":"ContainerStarted","Data":"fcd43a3c92ba93c551f215dacddf21636491ea7756207ad561fa8d9ecfad6e41"} Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.470896 4834 generic.go:334] "Generic (PLEG): container finished" podID="3482aeed-66bd-4fe3-81d6-c12cdab7f9d9" containerID="4564e39e2e8b9315beeda37a8470e393fc050e2bcec5d3d6b650671a86bfd7d4" exitCode=0 Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.471013 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4bs" event={"ID":"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9","Type":"ContainerDied","Data":"4564e39e2e8b9315beeda37a8470e393fc050e2bcec5d3d6b650671a86bfd7d4"} Oct 08 22:28:36 crc kubenswrapper[4834]: I1008 22:28:36.471069 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4bs" event={"ID":"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9","Type":"ContainerStarted","Data":"1898b2d9a91fc848f03601d5d1d08c940f17814e30bdfea68fe23de6cabbc35b"} Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.710566 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-npg7j"] Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.713237 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.715925 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.727101 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-npg7j"] Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.865801 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-utilities\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.866283 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-catalog-content\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.866612 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwwt\" (UniqueName: \"kubernetes.io/projected/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-kube-api-access-6gwwt\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.917711 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wvvps"] Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.932057 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-wvvps"] Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.932279 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.936345 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.967958 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-catalog-content\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.968017 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwwt\" (UniqueName: \"kubernetes.io/projected/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-kube-api-access-6gwwt\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.968064 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-utilities\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.968556 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-utilities\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " 
pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:37 crc kubenswrapper[4834]: I1008 22:28:37.968825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-catalog-content\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.000338 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwwt\" (UniqueName: \"kubernetes.io/projected/f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe-kube-api-access-6gwwt\") pod \"certified-operators-npg7j\" (UID: \"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe\") " pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.043695 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.069672 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7th8x\" (UniqueName: \"kubernetes.io/projected/f2fd386d-6b62-4282-9d8b-7325d579e8cb-kube-api-access-7th8x\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.069730 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2fd386d-6b62-4282-9d8b-7325d579e8cb-catalog-content\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.070058 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2fd386d-6b62-4282-9d8b-7325d579e8cb-utilities\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.171282 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2fd386d-6b62-4282-9d8b-7325d579e8cb-utilities\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.171375 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7th8x\" (UniqueName: \"kubernetes.io/projected/f2fd386d-6b62-4282-9d8b-7325d579e8cb-kube-api-access-7th8x\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.171401 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2fd386d-6b62-4282-9d8b-7325d579e8cb-catalog-content\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.172041 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2fd386d-6b62-4282-9d8b-7325d579e8cb-catalog-content\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.172220 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2fd386d-6b62-4282-9d8b-7325d579e8cb-utilities\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.192775 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7th8x\" (UniqueName: \"kubernetes.io/projected/f2fd386d-6b62-4282-9d8b-7325d579e8cb-kube-api-access-7th8x\") pod \"community-operators-wvvps\" (UID: \"f2fd386d-6b62-4282-9d8b-7325d579e8cb\") " pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.264271 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.302798 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-npg7j"] Oct 08 22:28:38 crc kubenswrapper[4834]: W1008 22:28:38.313356 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf137d3dc_1cb4_4d3b_a1a6_4c918ef2ddfe.slice/crio-ecc13be71c53ecece75bcc8d982d946765bf90f5d40401424abc6b46b990afb5 WatchSource:0}: Error finding container ecc13be71c53ecece75bcc8d982d946765bf90f5d40401424abc6b46b990afb5: Status 404 returned error can't find the container with id ecc13be71c53ecece75bcc8d982d946765bf90f5d40401424abc6b46b990afb5 Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.474469 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvvps"] Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.487507 4834 generic.go:334] "Generic (PLEG): container finished" podID="3482aeed-66bd-4fe3-81d6-c12cdab7f9d9" 
containerID="0e8fa66bc01a5ac61cc12ccab554f97d0f7fbf3d349d56dec563791faf2b1365" exitCode=0 Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.487572 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4bs" event={"ID":"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9","Type":"ContainerDied","Data":"0e8fa66bc01a5ac61cc12ccab554f97d0f7fbf3d349d56dec563791faf2b1365"} Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.492393 4834 generic.go:334] "Generic (PLEG): container finished" podID="057cb81c-17fd-4e22-8098-ea576e358559" containerID="c25682ce4e842b60d57b6556c872f8e862341a75fea238f56ee6b2533ce208ae" exitCode=0 Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.492464 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6hqr" event={"ID":"057cb81c-17fd-4e22-8098-ea576e358559","Type":"ContainerDied","Data":"c25682ce4e842b60d57b6556c872f8e862341a75fea238f56ee6b2533ce208ae"} Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.496518 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npg7j" event={"ID":"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe","Type":"ContainerStarted","Data":"af4304b8045a95c2c9a8dff17c7f7ded36749d0228a00842ff01c5d64d731495"} Oct 08 22:28:38 crc kubenswrapper[4834]: I1008 22:28:38.496567 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npg7j" event={"ID":"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe","Type":"ContainerStarted","Data":"ecc13be71c53ecece75bcc8d982d946765bf90f5d40401424abc6b46b990afb5"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.504273 4834 generic.go:334] "Generic (PLEG): container finished" podID="f2fd386d-6b62-4282-9d8b-7325d579e8cb" containerID="dbad34611f2ee859854ade027efa5e566ded62ec77c2f24e960720c66ee21ac6" exitCode=0 Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.504408 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wvvps" event={"ID":"f2fd386d-6b62-4282-9d8b-7325d579e8cb","Type":"ContainerDied","Data":"dbad34611f2ee859854ade027efa5e566ded62ec77c2f24e960720c66ee21ac6"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.504743 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvvps" event={"ID":"f2fd386d-6b62-4282-9d8b-7325d579e8cb","Type":"ContainerStarted","Data":"7ce1c13808268929165c133a181db5b71349ebe1f6498472cb0e1d90c7654112"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.507211 4834 generic.go:334] "Generic (PLEG): container finished" podID="f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe" containerID="af4304b8045a95c2c9a8dff17c7f7ded36749d0228a00842ff01c5d64d731495" exitCode=0 Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.507288 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npg7j" event={"ID":"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe","Type":"ContainerDied","Data":"af4304b8045a95c2c9a8dff17c7f7ded36749d0228a00842ff01c5d64d731495"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.507322 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npg7j" event={"ID":"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe","Type":"ContainerStarted","Data":"1a9afaabc6ba8c9dd19fb558b4878f38ed3347697e8a1307b5e9032abbdc118d"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.510713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4bs" event={"ID":"3482aeed-66bd-4fe3-81d6-c12cdab7f9d9","Type":"ContainerStarted","Data":"8d0bb2cb6ba824443f267dfe36cfdf4a649c60ae71c0b6d2e023d6da47778b9e"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.513909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q6hqr" 
event={"ID":"057cb81c-17fd-4e22-8098-ea576e358559","Type":"ContainerStarted","Data":"6f847286a27353d6f3e1166cb503dab988b84389cc37fa6f1a74cbf84a04e082"} Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.570698 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q6hqr" podStartSLOduration=1.8631177509999999 podStartE2EDuration="4.570669983s" podCreationTimestamp="2025-10-08 22:28:35 +0000 UTC" firstStartedPulling="2025-10-08 22:28:36.470239967 +0000 UTC m=+324.293124713" lastFinishedPulling="2025-10-08 22:28:39.177792199 +0000 UTC m=+327.000676945" observedRunningTime="2025-10-08 22:28:39.565909414 +0000 UTC m=+327.388794160" watchObservedRunningTime="2025-10-08 22:28:39.570669983 +0000 UTC m=+327.393554739" Oct 08 22:28:39 crc kubenswrapper[4834]: I1008 22:28:39.583068 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sg4bs" podStartSLOduration=1.94070831 podStartE2EDuration="4.583039898s" podCreationTimestamp="2025-10-08 22:28:35 +0000 UTC" firstStartedPulling="2025-10-08 22:28:36.474419077 +0000 UTC m=+324.297303823" lastFinishedPulling="2025-10-08 22:28:39.116750665 +0000 UTC m=+326.939635411" observedRunningTime="2025-10-08 22:28:39.581518977 +0000 UTC m=+327.404403723" watchObservedRunningTime="2025-10-08 22:28:39.583039898 +0000 UTC m=+327.405924644" Oct 08 22:28:40 crc kubenswrapper[4834]: I1008 22:28:40.521648 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvvps" event={"ID":"f2fd386d-6b62-4282-9d8b-7325d579e8cb","Type":"ContainerStarted","Data":"aefdb5c90f3a6eec7f6040838cab4df25921d0b772c9998e234f64154c5f64e3"} Oct 08 22:28:40 crc kubenswrapper[4834]: I1008 22:28:40.527791 4834 generic.go:334] "Generic (PLEG): container finished" podID="f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe" containerID="1a9afaabc6ba8c9dd19fb558b4878f38ed3347697e8a1307b5e9032abbdc118d" exitCode=0 Oct 
08 22:28:40 crc kubenswrapper[4834]: I1008 22:28:40.527908 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npg7j" event={"ID":"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe","Type":"ContainerDied","Data":"1a9afaabc6ba8c9dd19fb558b4878f38ed3347697e8a1307b5e9032abbdc118d"} Oct 08 22:28:41 crc kubenswrapper[4834]: I1008 22:28:41.537856 4834 generic.go:334] "Generic (PLEG): container finished" podID="f2fd386d-6b62-4282-9d8b-7325d579e8cb" containerID="aefdb5c90f3a6eec7f6040838cab4df25921d0b772c9998e234f64154c5f64e3" exitCode=0 Oct 08 22:28:41 crc kubenswrapper[4834]: I1008 22:28:41.538292 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvvps" event={"ID":"f2fd386d-6b62-4282-9d8b-7325d579e8cb","Type":"ContainerDied","Data":"aefdb5c90f3a6eec7f6040838cab4df25921d0b772c9998e234f64154c5f64e3"} Oct 08 22:28:42 crc kubenswrapper[4834]: I1008 22:28:42.549097 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvvps" event={"ID":"f2fd386d-6b62-4282-9d8b-7325d579e8cb","Type":"ContainerStarted","Data":"21f4938e8553bcd030a955fd0ba3a00b2a68e7324f45442743e480ffe3fcad8c"} Oct 08 22:28:42 crc kubenswrapper[4834]: I1008 22:28:42.552441 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-npg7j" event={"ID":"f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe","Type":"ContainerStarted","Data":"02c29f47494e754564e1fd55b16b6cef0f8b6d5f099c5dc2c7221c5fd12c52ff"} Oct 08 22:28:42 crc kubenswrapper[4834]: I1008 22:28:42.570364 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wvvps" podStartSLOduration=3.138541578 podStartE2EDuration="5.570349362s" podCreationTimestamp="2025-10-08 22:28:37 +0000 UTC" firstStartedPulling="2025-10-08 22:28:39.505688062 +0000 UTC m=+327.328572808" lastFinishedPulling="2025-10-08 22:28:41.937495806 +0000 UTC 
m=+329.760380592" observedRunningTime="2025-10-08 22:28:42.567628548 +0000 UTC m=+330.390513294" watchObservedRunningTime="2025-10-08 22:28:42.570349362 +0000 UTC m=+330.393234108" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.672748 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.675645 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.739236 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.767691 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-npg7j" podStartSLOduration=6.292384499 podStartE2EDuration="8.767657067s" podCreationTimestamp="2025-10-08 22:28:37 +0000 UTC" firstStartedPulling="2025-10-08 22:28:38.499333752 +0000 UTC m=+326.322218498" lastFinishedPulling="2025-10-08 22:28:40.97460632 +0000 UTC m=+328.797491066" observedRunningTime="2025-10-08 22:28:42.594910698 +0000 UTC m=+330.417795444" watchObservedRunningTime="2025-10-08 22:28:45.767657067 +0000 UTC m=+333.590541853" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.840888 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.840948 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:45 crc kubenswrapper[4834]: I1008 22:28:45.895987 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:46 crc kubenswrapper[4834]: I1008 
22:28:46.629722 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sg4bs" Oct 08 22:28:46 crc kubenswrapper[4834]: I1008 22:28:46.649667 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q6hqr" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.044878 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.045412 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.122473 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.264602 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.264694 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.335710 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.655092 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-npg7j" Oct 08 22:28:48 crc kubenswrapper[4834]: I1008 22:28:48.655564 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wvvps" Oct 08 22:29:17 crc kubenswrapper[4834]: I1008 22:29:17.025662 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:29:17 crc kubenswrapper[4834]: I1008 22:29:17.026879 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:29:47 crc kubenswrapper[4834]: I1008 22:29:47.025677 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:29:47 crc kubenswrapper[4834]: I1008 22:29:47.026677 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.200811 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn"] Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.202956 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.206387 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.207796 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.215077 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn"] Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.273725 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btr4f\" (UniqueName: \"kubernetes.io/projected/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-kube-api-access-btr4f\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.273837 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-config-volume\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.273868 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-secret-volume\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.375540 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-config-volume\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.375632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-secret-volume\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.375768 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btr4f\" (UniqueName: \"kubernetes.io/projected/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-kube-api-access-btr4f\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.376794 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-config-volume\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.384571 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-secret-volume\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.400122 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btr4f\" (UniqueName: \"kubernetes.io/projected/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-kube-api-access-btr4f\") pod \"collect-profiles-29332710-sj7zn\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.528490 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:00 crc kubenswrapper[4834]: I1008 22:30:00.803928 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn"] Oct 08 22:30:01 crc kubenswrapper[4834]: I1008 22:30:01.124604 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" event={"ID":"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1","Type":"ContainerStarted","Data":"b7e3ceae00cd0572166ae276fba98515dfa3144d20968b111ea7d37a1228fb05"} Oct 08 22:30:01 crc kubenswrapper[4834]: I1008 22:30:01.125102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" event={"ID":"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1","Type":"ContainerStarted","Data":"65cd4ab2685d96486e2adac0f5e130b53c03d317142d221c5acf5be7125d4346"} Oct 08 22:30:01 crc kubenswrapper[4834]: I1008 22:30:01.154368 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" 
podStartSLOduration=1.154331949 podStartE2EDuration="1.154331949s" podCreationTimestamp="2025-10-08 22:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:30:01.150363753 +0000 UTC m=+408.973248539" watchObservedRunningTime="2025-10-08 22:30:01.154331949 +0000 UTC m=+408.977216725" Oct 08 22:30:02 crc kubenswrapper[4834]: I1008 22:30:02.136461 4834 generic.go:334] "Generic (PLEG): container finished" podID="b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" containerID="b7e3ceae00cd0572166ae276fba98515dfa3144d20968b111ea7d37a1228fb05" exitCode=0 Oct 08 22:30:02 crc kubenswrapper[4834]: I1008 22:30:02.136560 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" event={"ID":"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1","Type":"ContainerDied","Data":"b7e3ceae00cd0572166ae276fba98515dfa3144d20968b111ea7d37a1228fb05"} Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.472851 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.531092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-secret-volume\") pod \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.531178 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btr4f\" (UniqueName: \"kubernetes.io/projected/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-kube-api-access-btr4f\") pod \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.531441 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-config-volume\") pod \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\" (UID: \"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1\") " Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.532195 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-config-volume" (OuterVolumeSpecName: "config-volume") pod "b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" (UID: "b80f2a3c-d45a-4888-9ac1-39f5ee2abea1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.539361 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" (UID: "b80f2a3c-d45a-4888-9ac1-39f5ee2abea1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.540050 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-kube-api-access-btr4f" (OuterVolumeSpecName: "kube-api-access-btr4f") pod "b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" (UID: "b80f2a3c-d45a-4888-9ac1-39f5ee2abea1"). InnerVolumeSpecName "kube-api-access-btr4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.632662 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.632713 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btr4f\" (UniqueName: \"kubernetes.io/projected/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-kube-api-access-btr4f\") on node \"crc\" DevicePath \"\"" Oct 08 22:30:03 crc kubenswrapper[4834]: I1008 22:30:03.632741 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:30:04 crc kubenswrapper[4834]: I1008 22:30:04.151397 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" event={"ID":"b80f2a3c-d45a-4888-9ac1-39f5ee2abea1","Type":"ContainerDied","Data":"65cd4ab2685d96486e2adac0f5e130b53c03d317142d221c5acf5be7125d4346"} Oct 08 22:30:04 crc kubenswrapper[4834]: I1008 22:30:04.151481 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65cd4ab2685d96486e2adac0f5e130b53c03d317142d221c5acf5be7125d4346" Oct 08 22:30:04 crc kubenswrapper[4834]: I1008 22:30:04.151491 4834 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn" Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.028876 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.029874 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.029952 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.030921 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"719bf800125f02404a6274ba54e261065ace13a170b73466f709f454b2e76f47"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.031015 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://719bf800125f02404a6274ba54e261065ace13a170b73466f709f454b2e76f47" gracePeriod=600 Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.247827 4834 generic.go:334] "Generic (PLEG): container 
finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="719bf800125f02404a6274ba54e261065ace13a170b73466f709f454b2e76f47" exitCode=0 Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.247910 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"719bf800125f02404a6274ba54e261065ace13a170b73466f709f454b2e76f47"} Oct 08 22:30:17 crc kubenswrapper[4834]: I1008 22:30:17.247986 4834 scope.go:117] "RemoveContainer" containerID="3f6d7a1f79a77e950556eb15fb0dd90bcbcb46c3e5539902432ef139ec3be784" Oct 08 22:30:18 crc kubenswrapper[4834]: I1008 22:30:18.256382 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"b487a4d9d655435511ba5dee397c544c606f8e8cfe6bd53b6020e82a7748e90e"} Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.554180 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2kjgs"] Oct 08 22:30:20 crc kubenswrapper[4834]: E1008 22:30:20.555097 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" containerName="collect-profiles" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.555123 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" containerName="collect-profiles" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.555388 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" containerName="collect-profiles" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.556460 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.569230 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2kjgs"] Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.716552 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e3caac5-e230-49eb-bf29-1716a6fe94aa-trusted-ca\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.716623 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92cf4\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-kube-api-access-92cf4\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.716659 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-registry-tls\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.717225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-bound-sa-token\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 
22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.717400 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e3caac5-e230-49eb-bf29-1716a6fe94aa-registry-certificates\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.717503 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.717627 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e3caac5-e230-49eb-bf29-1716a6fe94aa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.717862 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e3caac5-e230-49eb-bf29-1716a6fe94aa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.752836 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.819521 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e3caac5-e230-49eb-bf29-1716a6fe94aa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.819652 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e3caac5-e230-49eb-bf29-1716a6fe94aa-trusted-ca\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.819692 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92cf4\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-kube-api-access-92cf4\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.819761 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-registry-tls\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.819922 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-bound-sa-token\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.820601 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e3caac5-e230-49eb-bf29-1716a6fe94aa-registry-certificates\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.821292 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e3caac5-e230-49eb-bf29-1716a6fe94aa-trusted-ca\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.822294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e3caac5-e230-49eb-bf29-1716a6fe94aa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.822820 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e3caac5-e230-49eb-bf29-1716a6fe94aa-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 
22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.823549 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e3caac5-e230-49eb-bf29-1716a6fe94aa-registry-certificates\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.827932 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e3caac5-e230-49eb-bf29-1716a6fe94aa-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.829208 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-registry-tls\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.840012 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92cf4\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-kube-api-access-92cf4\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.850371 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e3caac5-e230-49eb-bf29-1716a6fe94aa-bound-sa-token\") pod \"image-registry-66df7c8f76-2kjgs\" (UID: \"4e3caac5-e230-49eb-bf29-1716a6fe94aa\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:20 crc kubenswrapper[4834]: I1008 22:30:20.873667 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:21 crc kubenswrapper[4834]: I1008 22:30:21.167891 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2kjgs"] Oct 08 22:30:21 crc kubenswrapper[4834]: I1008 22:30:21.280476 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" event={"ID":"4e3caac5-e230-49eb-bf29-1716a6fe94aa","Type":"ContainerStarted","Data":"043ff2782360c40ec37629aec64e7e011e3668f86a6ac8ead21991741e79992b"} Oct 08 22:30:22 crc kubenswrapper[4834]: I1008 22:30:22.292308 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" event={"ID":"4e3caac5-e230-49eb-bf29-1716a6fe94aa","Type":"ContainerStarted","Data":"558717ef9757e2f5e2f6a5df66c6b422b014cdae5754648ea5320053e66a23c7"} Oct 08 22:30:22 crc kubenswrapper[4834]: I1008 22:30:22.292866 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:22 crc kubenswrapper[4834]: I1008 22:30:22.329990 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" podStartSLOduration=2.32995398 podStartE2EDuration="2.32995398s" podCreationTimestamp="2025-10-08 22:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:30:22.321810584 +0000 UTC m=+430.144695370" watchObservedRunningTime="2025-10-08 22:30:22.32995398 +0000 UTC m=+430.152838756" Oct 08 22:30:40 crc kubenswrapper[4834]: I1008 22:30:40.879282 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2kjgs" Oct 08 22:30:40 crc kubenswrapper[4834]: I1008 22:30:40.964408 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n7fb"] Oct 08 22:31:06 crc kubenswrapper[4834]: I1008 22:31:06.481072 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" podUID="a975074f-5780-405c-bf73-36ebcaf7bb06" containerName="registry" containerID="cri-o://82239ce17c538ed3be5f479870256ca57d482fe205e55d41489a809824bf45b8" gracePeriod=30 Oct 08 22:31:06 crc kubenswrapper[4834]: I1008 22:31:06.639785 4834 generic.go:334] "Generic (PLEG): container finished" podID="a975074f-5780-405c-bf73-36ebcaf7bb06" containerID="82239ce17c538ed3be5f479870256ca57d482fe205e55d41489a809824bf45b8" exitCode=0 Oct 08 22:31:06 crc kubenswrapper[4834]: I1008 22:31:06.639845 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" event={"ID":"a975074f-5780-405c-bf73-36ebcaf7bb06","Type":"ContainerDied","Data":"82239ce17c538ed3be5f479870256ca57d482fe205e55d41489a809824bf45b8"} Oct 08 22:31:06 crc kubenswrapper[4834]: I1008 22:31:06.953211 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.063317 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a975074f-5780-405c-bf73-36ebcaf7bb06-installation-pull-secrets\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.063441 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-tls\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.063906 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.063967 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2w6v\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-kube-api-access-p2w6v\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.064000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-bound-sa-token\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.064046 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-trusted-ca\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.064092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a975074f-5780-405c-bf73-36ebcaf7bb06-ca-trust-extracted\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.064135 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-certificates\") pod \"a975074f-5780-405c-bf73-36ebcaf7bb06\" (UID: \"a975074f-5780-405c-bf73-36ebcaf7bb06\") " Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.065259 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.065690 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.073610 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.075641 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.075683 4834 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.075701 4834 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.075731 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-kube-api-access-p2w6v" (OuterVolumeSpecName: "kube-api-access-p2w6v") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "kube-api-access-p2w6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.076838 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a975074f-5780-405c-bf73-36ebcaf7bb06-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.076937 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.080388 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.099491 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a975074f-5780-405c-bf73-36ebcaf7bb06-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a975074f-5780-405c-bf73-36ebcaf7bb06" (UID: "a975074f-5780-405c-bf73-36ebcaf7bb06"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.177668 4834 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a975074f-5780-405c-bf73-36ebcaf7bb06-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.177747 4834 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a975074f-5780-405c-bf73-36ebcaf7bb06-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.177774 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2w6v\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-kube-api-access-p2w6v\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.177797 4834 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a975074f-5780-405c-bf73-36ebcaf7bb06-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.650652 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" event={"ID":"a975074f-5780-405c-bf73-36ebcaf7bb06","Type":"ContainerDied","Data":"39750ffc8ecd8b545a37202e8fc325e20b685a90b4bc211e7e35fe66f213db32"} Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.650743 4834 scope.go:117] "RemoveContainer" containerID="82239ce17c538ed3be5f479870256ca57d482fe205e55d41489a809824bf45b8" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.650786 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9n7fb" Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.689038 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n7fb"] Oct 08 22:31:07 crc kubenswrapper[4834]: I1008 22:31:07.696108 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9n7fb"] Oct 08 22:31:09 crc kubenswrapper[4834]: I1008 22:31:09.569414 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a975074f-5780-405c-bf73-36ebcaf7bb06" path="/var/lib/kubelet/pods/a975074f-5780-405c-bf73-36ebcaf7bb06/volumes" Oct 08 22:32:17 crc kubenswrapper[4834]: I1008 22:32:17.025778 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:32:17 crc kubenswrapper[4834]: I1008 22:32:17.026808 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:32:47 crc kubenswrapper[4834]: I1008 22:32:47.026374 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:32:47 crc kubenswrapper[4834]: I1008 22:32:47.027391 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" 
podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.025764 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.026715 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.026789 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.027642 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b487a4d9d655435511ba5dee397c544c606f8e8cfe6bd53b6020e82a7748e90e"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.027791 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://b487a4d9d655435511ba5dee397c544c606f8e8cfe6bd53b6020e82a7748e90e" gracePeriod=600 Oct 08 
22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.658658 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="b487a4d9d655435511ba5dee397c544c606f8e8cfe6bd53b6020e82a7748e90e" exitCode=0 Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.658738 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"b487a4d9d655435511ba5dee397c544c606f8e8cfe6bd53b6020e82a7748e90e"} Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.659131 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"52a93a6fe63650f6a03b209142df1e3ce01f805d96d6142f73c9c419354c0aca"} Oct 08 22:33:17 crc kubenswrapper[4834]: I1008 22:33:17.659198 4834 scope.go:117] "RemoveContainer" containerID="719bf800125f02404a6274ba54e261065ace13a170b73466f709f454b2e76f47" Oct 08 22:35:17 crc kubenswrapper[4834]: I1008 22:35:17.025688 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:35:17 crc kubenswrapper[4834]: I1008 22:35:17.026908 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.241133 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-wrrs9"] Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242294 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-controller" containerID="cri-o://5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242401 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="nbdb" containerID="cri-o://c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242491 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="northd" containerID="cri-o://1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242573 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242597 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="sbdb" containerID="cri-o://2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242645 4834 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-acl-logging" containerID="cri-o://129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.242741 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-node" containerID="cri-o://9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.325351 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" containerID="cri-o://e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" gracePeriod=30 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.611367 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/2.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.612674 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/1.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.612715 4834 generic.go:334] "Generic (PLEG): container finished" podID="b150123b-551e-4c12-afa1-0c651719d3f2" containerID="e8ed8e2d5d9a78bb58ff2b75a8232ded894aa521707893d1b0e1ffa1dc3c4cb0" exitCode=2 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.612791 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerDied","Data":"e8ed8e2d5d9a78bb58ff2b75a8232ded894aa521707893d1b0e1ffa1dc3c4cb0"} Oct 08 22:35:20 crc 
kubenswrapper[4834]: I1008 22:35:20.612831 4834 scope.go:117] "RemoveContainer" containerID="fd1e83915e532b2e5b950d711cd3cec1216f3d984e9d620f71d233e95982792d" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.612829 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/3.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.613318 4834 scope.go:117] "RemoveContainer" containerID="e8ed8e2d5d9a78bb58ff2b75a8232ded894aa521707893d1b0e1ffa1dc3c4cb0" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.613496 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-f297z_openshift-multus(b150123b-551e-4c12-afa1-0c651719d3f2)\"" pod="openshift-multus/multus-f297z" podUID="b150123b-551e-4c12-afa1-0c651719d3f2" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.615217 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovnkube-controller/3.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.617425 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovn-acl-logging/0.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.618103 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovn-controller/0.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.618854 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.620956 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovn-acl-logging/0.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.621468 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrrs9_f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/ovn-controller/0.log" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622102 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" exitCode=0 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622125 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" exitCode=0 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622134 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" exitCode=0 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622155 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" exitCode=0 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622162 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" exitCode=0 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622168 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" 
containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" exitCode=0 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622176 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" exitCode=143 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622183 4834 generic.go:334] "Generic (PLEG): container finished" podID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" exitCode=143 Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622236 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622249 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622262 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622273 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622292 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622303 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622309 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622314 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622320 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622325 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622331 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622337 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622343 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622350 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622358 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622367 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622374 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622381 4834 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622387 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622393 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622399 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622405 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622411 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622418 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622424 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622431 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622439 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622447 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622452 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622458 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622464 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622469 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622475 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} Oct 08 
22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622479 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622484 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622490 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622498 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" event={"ID":"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc","Type":"ContainerDied","Data":"2a42fb3fdc38aeee5025bdc655cb04ffcb33ee59bf21601262bc514d75501646"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622505 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622511 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622516 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622522 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622527 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622532 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622537 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622542 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622548 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.622553 4834 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.638868 4834 scope.go:117] "RemoveContainer" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.662314 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 
22:35:20.685756 4834 scope.go:117] "RemoveContainer" containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703459 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zx6gd"] Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703663 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703683 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703693 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-acl-logging" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703700 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-acl-logging" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703710 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703716 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703725 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a975074f-5780-405c-bf73-36ebcaf7bb06" containerName="registry" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703731 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a975074f-5780-405c-bf73-36ebcaf7bb06" containerName="registry" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703739 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kubecfg-setup" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703744 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kubecfg-setup" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703753 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703759 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703768 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="northd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703772 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="northd" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703781 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="nbdb" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703788 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="nbdb" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703796 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703803 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703811 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="sbdb" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703817 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="sbdb" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703823 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-node" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703830 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-node" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.703837 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703843 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703921 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-node" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703929 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="sbdb" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703939 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="northd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703946 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-acl-logging" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703955 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703962 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="kube-rbac-proxy-ovn-metrics" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703970 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="nbdb" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703978 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703986 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovn-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.703995 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a975074f-5780-405c-bf73-36ebcaf7bb06" containerName="registry" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.704002 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.704112 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.704267 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.704344 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.704428 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.704435 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.704510 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" containerName="ovnkube-controller" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.704245 4834 scope.go:117] "RemoveContainer" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.705839 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719446 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-node-log\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-bin\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719558 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-node-log" (OuterVolumeSpecName: "node-log") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719611 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-kubelet\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719669 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-netns\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719695 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719708 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw97q\" (UniqueName: \"kubernetes.io/projected/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-kube-api-access-fw97q\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-etc-openvswitch\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719776 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-script-lib\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719810 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-systemd\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719750 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719814 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719870 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.720288 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.719839 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-var-lib-openvswitch\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721432 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-systemd-units\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721490 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721522 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-slash\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721538 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721543 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-ovn\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721587 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-openvswitch\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721598 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721611 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-ovn-kubernetes\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721622 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-slash" (OuterVolumeSpecName: "host-slash") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721638 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-config\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721647 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721654 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-netd\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721671 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-log-socket\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721694 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-env-overrides\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721718 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovn-node-metrics-cert\") pod \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\" (UID: \"f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc\") " Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722220 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-run-ovn-kubernetes\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.721670 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722195 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-log-socket" (OuterVolumeSpecName: "log-socket") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722217 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722240 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722253 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-ovnkube-script-lib\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722314 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-node-log\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722337 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722326 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722357 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-env-overrides\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722543 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-kubelet\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722596 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722628 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-ovn\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-cni-bin\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722681 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-log-socket\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.722649 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723020 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-var-lib-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723062 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-ovnkube-config\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723089 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-slash\") pod \"ovnkube-node-zx6gd\" 
(UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-systemd\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723171 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2qnn\" (UniqueName: \"kubernetes.io/projected/010c0911-c97b-4712-9b06-1cdd7a88d823-kube-api-access-v2qnn\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723209 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-systemd-units\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723329 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-cni-netd\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723508 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-etc-openvswitch\") pod 
\"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723564 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-run-netns\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723622 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/010c0911-c97b-4712-9b06-1cdd7a88d823-ovn-node-metrics-cert\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723757 4834 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723790 4834 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723813 4834 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723835 4834 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-etc-openvswitch\") on node \"crc\" DevicePath 
\"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723858 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723886 4834 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723912 4834 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.723985 4834 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724021 4834 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-slash\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724044 4834 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724064 4834 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724088 4834 
reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724109 4834 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724130 4834 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724185 4834 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-log-socket\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724207 4834 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.724229 4834 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-node-log\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.734017 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-kube-api-access-fw97q" (OuterVolumeSpecName: "kube-api-access-fw97q") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "kube-api-access-fw97q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.735649 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.741834 4834 scope.go:117] "RemoveContainer" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.752631 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" (UID: "f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.757694 4834 scope.go:117] "RemoveContainer" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.770577 4834 scope.go:117] "RemoveContainer" containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.784685 4834 scope.go:117] "RemoveContainer" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.797135 4834 scope.go:117] "RemoveContainer" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.809113 4834 scope.go:117] "RemoveContainer" containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.823097 4834 scope.go:117] "RemoveContainer" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.823592 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": container with ID starting with e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b not found: ID does not exist" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.823624 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} err="failed to get container status \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": rpc error: code = NotFound desc = could not find container 
\"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": container with ID starting with e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.823645 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.823985 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": container with ID starting with 0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946 not found: ID does not exist" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824035 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} err="failed to get container status \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": rpc error: code = NotFound desc = could not find container \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": container with ID starting with 0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824067 4834 scope.go:117] "RemoveContainer" containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.824338 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": container with ID starting with 2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236 not found: ID does not exist" 
containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824364 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} err="failed to get container status \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": rpc error: code = NotFound desc = could not find container \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": container with ID starting with 2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824379 4834 scope.go:117] "RemoveContainer" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.824627 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": container with ID starting with c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf not found: ID does not exist" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824648 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} err="failed to get container status \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": rpc error: code = NotFound desc = could not find container \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": container with ID starting with c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824660 4834 scope.go:117] 
"RemoveContainer" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824677 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-etc-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824715 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/010c0911-c97b-4712-9b06-1cdd7a88d823-ovn-node-metrics-cert\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824745 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-run-netns\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824785 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-run-ovn-kubernetes\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824804 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-etc-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824819 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-ovnkube-script-lib\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824878 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824906 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-node-log\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-env-overrides\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.824958 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-kubelet\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825001 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825040 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-ovn\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825069 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-cni-bin\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-log-socket\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-var-lib-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: 
I1008 22:35:20.825170 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-ovnkube-config\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825191 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-slash\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825212 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-systemd\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825256 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2qnn\" (UniqueName: \"kubernetes.io/projected/010c0911-c97b-4712-9b06-1cdd7a88d823-kube-api-access-v2qnn\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825288 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-systemd-units\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825309 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-cni-netd\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825367 4834 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825381 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825395 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw97q\" (UniqueName: \"kubernetes.io/projected/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc-kube-api-access-fw97q\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825423 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-cni-netd\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825445 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825468 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-node-log\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825664 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-run-netns\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825710 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-run-ovn-kubernetes\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825730 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-var-lib-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825740 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-kubelet\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825764 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-cni-bin\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825770 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-log-socket\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.825812 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": container with ID starting with 1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6 not found: ID does not exist" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825834 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-ovn\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825860 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} err="failed to get container status \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": rpc error: code = NotFound desc = could not find container \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": container with ID starting with 1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6 not found: ID does not exist" Oct 
08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825874 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-systemd-units\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825889 4834 scope.go:117] "RemoveContainer" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825912 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-openvswitch\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825932 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-host-slash\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.825941 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-ovnkube-script-lib\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826037 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/010c0911-c97b-4712-9b06-1cdd7a88d823-run-systemd\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826385 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-ovnkube-config\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.826478 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": container with ID starting with 1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385 not found: ID does not exist" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826503 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} err="failed to get container status \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": rpc error: code = NotFound desc = could not find container \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": container with ID starting with 1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826517 4834 scope.go:117] "RemoveContainer" containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.826747 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": container with ID starting with 
9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b not found: ID does not exist" containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826771 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} err="failed to get container status \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": rpc error: code = NotFound desc = could not find container \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": container with ID starting with 9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826785 4834 scope.go:117] "RemoveContainer" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.826964 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": container with ID starting with 129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec not found: ID does not exist" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.826988 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} err="failed to get container status \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": rpc error: code = NotFound desc = could not find container \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": container with ID starting with 129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec not found: ID does not 
exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827003 4834 scope.go:117] "RemoveContainer" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.827333 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": container with ID starting with 5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc not found: ID does not exist" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827353 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} err="failed to get container status \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": rpc error: code = NotFound desc = could not find container \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": container with ID starting with 5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827366 4834 scope.go:117] "RemoveContainer" containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827415 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/010c0911-c97b-4712-9b06-1cdd7a88d823-env-overrides\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: E1008 22:35:20.827591 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": container with ID starting with f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1 not found: ID does not exist" containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827610 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} err="failed to get container status \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": rpc error: code = NotFound desc = could not find container \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": container with ID starting with f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827621 4834 scope.go:117] "RemoveContainer" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827820 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} err="failed to get container status \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": rpc error: code = NotFound desc = could not find container \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": container with ID starting with e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.827838 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828022 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} err="failed to get container status \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": rpc error: code = NotFound desc = could not find container \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": container with ID starting with 0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828040 4834 scope.go:117] "RemoveContainer" containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828226 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} err="failed to get container status \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": rpc error: code = NotFound desc = could not find container \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": container with ID starting with 2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828244 4834 scope.go:117] "RemoveContainer" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828385 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} err="failed to get container status \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": rpc error: code = NotFound desc = could not find container \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": container with ID starting with c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf not found: ID does not 
exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828405 4834 scope.go:117] "RemoveContainer" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828619 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} err="failed to get container status \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": rpc error: code = NotFound desc = could not find container \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": container with ID starting with 1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828636 4834 scope.go:117] "RemoveContainer" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828872 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} err="failed to get container status \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": rpc error: code = NotFound desc = could not find container \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": container with ID starting with 1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.828890 4834 scope.go:117] "RemoveContainer" containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829070 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} err="failed to get container status 
\"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": rpc error: code = NotFound desc = could not find container \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": container with ID starting with 9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829085 4834 scope.go:117] "RemoveContainer" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829261 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} err="failed to get container status \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": rpc error: code = NotFound desc = could not find container \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": container with ID starting with 129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829278 4834 scope.go:117] "RemoveContainer" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829440 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} err="failed to get container status \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": rpc error: code = NotFound desc = could not find container \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": container with ID starting with 5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829459 4834 scope.go:117] "RemoveContainer" 
containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829620 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} err="failed to get container status \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": rpc error: code = NotFound desc = could not find container \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": container with ID starting with f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829636 4834 scope.go:117] "RemoveContainer" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829814 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} err="failed to get container status \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": rpc error: code = NotFound desc = could not find container \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": container with ID starting with e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.829831 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830004 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} err="failed to get container status \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": rpc error: code = NotFound desc = could 
not find container \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": container with ID starting with 0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830020 4834 scope.go:117] "RemoveContainer" containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830197 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} err="failed to get container status \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": rpc error: code = NotFound desc = could not find container \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": container with ID starting with 2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830213 4834 scope.go:117] "RemoveContainer" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830443 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} err="failed to get container status \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": rpc error: code = NotFound desc = could not find container \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": container with ID starting with c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830464 4834 scope.go:117] "RemoveContainer" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 
22:35:20.830680 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} err="failed to get container status \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": rpc error: code = NotFound desc = could not find container \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": container with ID starting with 1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830697 4834 scope.go:117] "RemoveContainer" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830870 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} err="failed to get container status \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": rpc error: code = NotFound desc = could not find container \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": container with ID starting with 1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.830888 4834 scope.go:117] "RemoveContainer" containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831042 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} err="failed to get container status \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": rpc error: code = NotFound desc = could not find container \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": container with ID starting with 
9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831073 4834 scope.go:117] "RemoveContainer" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831421 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} err="failed to get container status \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": rpc error: code = NotFound desc = could not find container \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": container with ID starting with 129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831442 4834 scope.go:117] "RemoveContainer" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831712 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} err="failed to get container status \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": rpc error: code = NotFound desc = could not find container \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": container with ID starting with 5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831733 4834 scope.go:117] "RemoveContainer" containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831920 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} err="failed to get container status \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": rpc error: code = NotFound desc = could not find container \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": container with ID starting with f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.831938 4834 scope.go:117] "RemoveContainer" containerID="e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832180 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b"} err="failed to get container status \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": rpc error: code = NotFound desc = could not find container \"e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b\": container with ID starting with e4b252b78ffd0fc6e0efeb618b2d31a6b1cbc0505215d59ea58a461114eb7e9b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832197 4834 scope.go:117] "RemoveContainer" containerID="0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832397 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946"} err="failed to get container status \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": rpc error: code = NotFound desc = could not find container \"0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946\": container with ID starting with 0aa99840075f5828b1fea5154a7356a712b9c6bc786f4124f70a54e7332d0946 not found: ID does not 
exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832429 4834 scope.go:117] "RemoveContainer" containerID="2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832589 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236"} err="failed to get container status \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": rpc error: code = NotFound desc = could not find container \"2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236\": container with ID starting with 2b30c92a0528d1d24e2fc109a1dec7da2331fd3efb3973d84eb020f7b1d56236 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832607 4834 scope.go:117] "RemoveContainer" containerID="c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832738 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/010c0911-c97b-4712-9b06-1cdd7a88d823-ovn-node-metrics-cert\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.832748 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf"} err="failed to get container status \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": rpc error: code = NotFound desc = could not find container \"c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf\": container with ID starting with c545f12c96ef1f3dc4cd2a810e166f662266928d0ecd1278572ee4a9913a6ecf not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 
22:35:20.832793 4834 scope.go:117] "RemoveContainer" containerID="1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833038 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6"} err="failed to get container status \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": rpc error: code = NotFound desc = could not find container \"1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6\": container with ID starting with 1f28a2c1482a3972f1d9befcb2fe309a02df1a0f8e0527f75623b2fef6285cc6 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833058 4834 scope.go:117] "RemoveContainer" containerID="1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833276 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385"} err="failed to get container status \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": rpc error: code = NotFound desc = could not find container \"1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385\": container with ID starting with 1c6617170c15ced20e6174f7a53c7ea562c5f15f8f87c6239bdc0bfa52b29385 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833292 4834 scope.go:117] "RemoveContainer" containerID="9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833618 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b"} err="failed to get container status 
\"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": rpc error: code = NotFound desc = could not find container \"9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b\": container with ID starting with 9eb47e34d624fc2b3a5b4844da76d6771258343bac81a571285b47994af2581b not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833634 4834 scope.go:117] "RemoveContainer" containerID="129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833817 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec"} err="failed to get container status \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": rpc error: code = NotFound desc = could not find container \"129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec\": container with ID starting with 129e5a43da633568de27c22c3a8ad822f355fa20ed4f8e268b3b1e7f27bb35ec not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.833842 4834 scope.go:117] "RemoveContainer" containerID="5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.834115 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc"} err="failed to get container status \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": rpc error: code = NotFound desc = could not find container \"5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc\": container with ID starting with 5127ab301cf818f8e4b220c41456425a76304bf7d7329856f00ddb9e05c7a1cc not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.834184 4834 scope.go:117] "RemoveContainer" 
containerID="f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.834426 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1"} err="failed to get container status \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": rpc error: code = NotFound desc = could not find container \"f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1\": container with ID starting with f83d0247694f2749ab19e28c5c96a4bf8cba916f41f9303d9c7b9601d69ca9e1 not found: ID does not exist" Oct 08 22:35:20 crc kubenswrapper[4834]: I1008 22:35:20.847928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2qnn\" (UniqueName: \"kubernetes.io/projected/010c0911-c97b-4712-9b06-1cdd7a88d823-kube-api-access-v2qnn\") pod \"ovnkube-node-zx6gd\" (UID: \"010c0911-c97b-4712-9b06-1cdd7a88d823\") " pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.023313 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:21 crc kubenswrapper[4834]: W1008 22:35:21.054550 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod010c0911_c97b_4712_9b06_1cdd7a88d823.slice/crio-e6221ba440817c52a7be4302cdfc3625131b80035624f57d10ebf9c8ea9427ea WatchSource:0}: Error finding container e6221ba440817c52a7be4302cdfc3625131b80035624f57d10ebf9c8ea9427ea: Status 404 returned error can't find the container with id e6221ba440817c52a7be4302cdfc3625131b80035624f57d10ebf9c8ea9427ea Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.641205 4834 generic.go:334] "Generic (PLEG): container finished" podID="010c0911-c97b-4712-9b06-1cdd7a88d823" containerID="c0e080a121949ad38cec76768405053b6fa1117a31db943c805cd7362e3ae174" exitCode=0 Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.641258 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerDied","Data":"c0e080a121949ad38cec76768405053b6fa1117a31db943c805cd7362e3ae174"} Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.643301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"e6221ba440817c52a7be4302cdfc3625131b80035624f57d10ebf9c8ea9427ea"} Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.647108 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/2.log" Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.650113 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrrs9" Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.725549 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wrrs9"] Oct 08 22:35:21 crc kubenswrapper[4834]: I1008 22:35:21.732200 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wrrs9"] Oct 08 22:35:22 crc kubenswrapper[4834]: I1008 22:35:22.660389 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"faa7e4e0d7a6c32e45505021859bdfda646a0141bd98c3834fc8cc4afca0b756"} Oct 08 22:35:22 crc kubenswrapper[4834]: I1008 22:35:22.660955 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"935d1f837a063bedfedb8c6b60f4871e6a482cb8e78ce172974712d95a0ae84b"} Oct 08 22:35:22 crc kubenswrapper[4834]: I1008 22:35:22.660978 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"03516e35c1b3cc9f71e9d1bb0be1a8b78fb56faae1874840763aab9abbc73208"} Oct 08 22:35:22 crc kubenswrapper[4834]: I1008 22:35:22.660998 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"79ca13102e87379e45d570dec4210b74015ad8e70a7007377493bf84efae2cec"} Oct 08 22:35:22 crc kubenswrapper[4834]: I1008 22:35:22.661015 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" 
event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"4dfbf6a7021a1d880619b9bb408b423d2d147d5be45ab8774cec35be111a2d26"} Oct 08 22:35:22 crc kubenswrapper[4834]: I1008 22:35:22.661033 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"85505e42129b30f9457692d48eae31e0a9c0ca52dbfd6710f1603f28bf2eb188"} Oct 08 22:35:23 crc kubenswrapper[4834]: I1008 22:35:23.564857 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc" path="/var/lib/kubelet/pods/f8fab27a-1d41-4e0d-b81d-a6b9af31ecdc/volumes" Oct 08 22:35:25 crc kubenswrapper[4834]: I1008 22:35:25.702950 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"695ed7e78784b245dcabec1e80c703eee0a4a953cf72fe76e6079275cd34a128"} Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.109345 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5z2r9"] Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.110180 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.112984 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.113275 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.113286 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.117262 4834 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lcqhv" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.247135 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfn2\" (UniqueName: \"kubernetes.io/projected/6b99a056-947f-4c55-9725-c25b17fd456b-kube-api-access-qrfn2\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.247533 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b99a056-947f-4c55-9725-c25b17fd456b-crc-storage\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.247771 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b99a056-947f-4c55-9725-c25b17fd456b-node-mnt\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.348807 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b99a056-947f-4c55-9725-c25b17fd456b-crc-storage\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.349276 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b99a056-947f-4c55-9725-c25b17fd456b-node-mnt\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.349443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfn2\" (UniqueName: \"kubernetes.io/projected/6b99a056-947f-4c55-9725-c25b17fd456b-kube-api-access-qrfn2\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.349775 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b99a056-947f-4c55-9725-c25b17fd456b-node-mnt\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.350435 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b99a056-947f-4c55-9725-c25b17fd456b-crc-storage\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.380412 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfn2\" (UniqueName: 
\"kubernetes.io/projected/6b99a056-947f-4c55-9725-c25b17fd456b-kube-api-access-qrfn2\") pod \"crc-storage-crc-5z2r9\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: I1008 22:35:26.440492 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: E1008 22:35:26.480762 4834 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(4f676743b531b192e0da4f6cc8cdb862da78d8495371758b7cea30416aaec819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 22:35:26 crc kubenswrapper[4834]: E1008 22:35:26.480886 4834 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(4f676743b531b192e0da4f6cc8cdb862da78d8495371758b7cea30416aaec819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: E1008 22:35:26.480927 4834 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(4f676743b531b192e0da4f6cc8cdb862da78d8495371758b7cea30416aaec819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:26 crc kubenswrapper[4834]: E1008 22:35:26.481066 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5z2r9_crc-storage(6b99a056-947f-4c55-9725-c25b17fd456b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5z2r9_crc-storage(6b99a056-947f-4c55-9725-c25b17fd456b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(4f676743b531b192e0da4f6cc8cdb862da78d8495371758b7cea30416aaec819): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5z2r9" podUID="6b99a056-947f-4c55-9725-c25b17fd456b" Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.725486 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" event={"ID":"010c0911-c97b-4712-9b06-1cdd7a88d823","Type":"ContainerStarted","Data":"b19b5ff2689d7d2f7e5f049c22de99729a7c11d8e14f9aa7258ac6239872c0e3"} Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.726109 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.726123 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.726137 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.777594 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" podStartSLOduration=7.777570856 podStartE2EDuration="7.777570856s" podCreationTimestamp="2025-10-08 22:35:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:35:27.776422087 +0000 UTC m=+735.599306863" watchObservedRunningTime="2025-10-08 22:35:27.777570856 +0000 UTC m=+735.600455612" Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.780417 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:27 crc kubenswrapper[4834]: I1008 22:35:27.790315 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:28 crc kubenswrapper[4834]: I1008 22:35:28.027417 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5z2r9"] Oct 08 22:35:28 crc kubenswrapper[4834]: I1008 22:35:28.027584 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:28 crc kubenswrapper[4834]: I1008 22:35:28.028236 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:28 crc kubenswrapper[4834]: E1008 22:35:28.066793 4834 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(f168230d444d1a0b2144a19b2704c3e6d9ba3636d5a8a94b7061871a9589aefb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 08 22:35:28 crc kubenswrapper[4834]: E1008 22:35:28.066896 4834 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(f168230d444d1a0b2144a19b2704c3e6d9ba3636d5a8a94b7061871a9589aefb): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:28 crc kubenswrapper[4834]: E1008 22:35:28.066944 4834 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(f168230d444d1a0b2144a19b2704c3e6d9ba3636d5a8a94b7061871a9589aefb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:28 crc kubenswrapper[4834]: E1008 22:35:28.067027 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5z2r9_crc-storage(6b99a056-947f-4c55-9725-c25b17fd456b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5z2r9_crc-storage(6b99a056-947f-4c55-9725-c25b17fd456b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5z2r9_crc-storage_6b99a056-947f-4c55-9725-c25b17fd456b_0(f168230d444d1a0b2144a19b2704c3e6d9ba3636d5a8a94b7061871a9589aefb): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5z2r9" podUID="6b99a056-947f-4c55-9725-c25b17fd456b" Oct 08 22:35:32 crc kubenswrapper[4834]: I1008 22:35:32.555373 4834 scope.go:117] "RemoveContainer" containerID="e8ed8e2d5d9a78bb58ff2b75a8232ded894aa521707893d1b0e1ffa1dc3c4cb0" Oct 08 22:35:32 crc kubenswrapper[4834]: I1008 22:35:32.765548 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/2.log" Oct 08 22:35:33 crc kubenswrapper[4834]: I1008 22:35:33.776364 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f297z_b150123b-551e-4c12-afa1-0c651719d3f2/kube-multus/2.log" Oct 08 22:35:33 crc kubenswrapper[4834]: I1008 22:35:33.776456 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f297z" event={"ID":"b150123b-551e-4c12-afa1-0c651719d3f2","Type":"ContainerStarted","Data":"63bc2b02244f8dd0d8ea6178df910fdcc75e1d1701f3c76dccfed9c516d5e284"} Oct 08 22:35:38 crc kubenswrapper[4834]: I1008 22:35:38.555393 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:38 crc kubenswrapper[4834]: I1008 22:35:38.556400 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:38 crc kubenswrapper[4834]: I1008 22:35:38.868765 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5z2r9"] Oct 08 22:35:38 crc kubenswrapper[4834]: I1008 22:35:38.879254 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:35:39 crc kubenswrapper[4834]: I1008 22:35:39.834410 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5z2r9" event={"ID":"6b99a056-947f-4c55-9725-c25b17fd456b","Type":"ContainerStarted","Data":"a62d40bb95b405adcad3d6e0e4cf788f5d936269ec64026a5f22d114708e3386"} Oct 08 22:35:41 crc kubenswrapper[4834]: I1008 22:35:41.849082 4834 generic.go:334] "Generic (PLEG): container finished" podID="6b99a056-947f-4c55-9725-c25b17fd456b" containerID="e37445aeeb5f166a9554dec77d4ddefca9ea6ccfa3e6d4d0489d439278a08fe5" exitCode=0 Oct 08 22:35:41 crc kubenswrapper[4834]: I1008 22:35:41.849170 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5z2r9" event={"ID":"6b99a056-947f-4c55-9725-c25b17fd456b","Type":"ContainerDied","Data":"e37445aeeb5f166a9554dec77d4ddefca9ea6ccfa3e6d4d0489d439278a08fe5"} Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.200806 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.309336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrfn2\" (UniqueName: \"kubernetes.io/projected/6b99a056-947f-4c55-9725-c25b17fd456b-kube-api-access-qrfn2\") pod \"6b99a056-947f-4c55-9725-c25b17fd456b\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.309447 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b99a056-947f-4c55-9725-c25b17fd456b-crc-storage\") pod \"6b99a056-947f-4c55-9725-c25b17fd456b\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.309515 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b99a056-947f-4c55-9725-c25b17fd456b-node-mnt\") pod \"6b99a056-947f-4c55-9725-c25b17fd456b\" (UID: \"6b99a056-947f-4c55-9725-c25b17fd456b\") " Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.309878 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b99a056-947f-4c55-9725-c25b17fd456b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6b99a056-947f-4c55-9725-c25b17fd456b" (UID: "6b99a056-947f-4c55-9725-c25b17fd456b"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.317141 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b99a056-947f-4c55-9725-c25b17fd456b-kube-api-access-qrfn2" (OuterVolumeSpecName: "kube-api-access-qrfn2") pod "6b99a056-947f-4c55-9725-c25b17fd456b" (UID: "6b99a056-947f-4c55-9725-c25b17fd456b"). InnerVolumeSpecName "kube-api-access-qrfn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.325332 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b99a056-947f-4c55-9725-c25b17fd456b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6b99a056-947f-4c55-9725-c25b17fd456b" (UID: "6b99a056-947f-4c55-9725-c25b17fd456b"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.411851 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrfn2\" (UniqueName: \"kubernetes.io/projected/6b99a056-947f-4c55-9725-c25b17fd456b-kube-api-access-qrfn2\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.412445 4834 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6b99a056-947f-4c55-9725-c25b17fd456b-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.412474 4834 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6b99a056-947f-4c55-9725-c25b17fd456b-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.866223 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5z2r9" event={"ID":"6b99a056-947f-4c55-9725-c25b17fd456b","Type":"ContainerDied","Data":"a62d40bb95b405adcad3d6e0e4cf788f5d936269ec64026a5f22d114708e3386"} Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.866291 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5z2r9" Oct 08 22:35:43 crc kubenswrapper[4834]: I1008 22:35:43.866297 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a62d40bb95b405adcad3d6e0e4cf788f5d936269ec64026a5f22d114708e3386" Oct 08 22:35:47 crc kubenswrapper[4834]: I1008 22:35:47.025272 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:35:47 crc kubenswrapper[4834]: I1008 22:35:47.027286 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:35:49 crc kubenswrapper[4834]: I1008 22:35:49.955682 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc"] Oct 08 22:35:49 crc kubenswrapper[4834]: E1008 22:35:49.956016 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b99a056-947f-4c55-9725-c25b17fd456b" containerName="storage" Oct 08 22:35:49 crc kubenswrapper[4834]: I1008 22:35:49.956035 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b99a056-947f-4c55-9725-c25b17fd456b" containerName="storage" Oct 08 22:35:49 crc kubenswrapper[4834]: I1008 22:35:49.956172 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b99a056-947f-4c55-9725-c25b17fd456b" containerName="storage" Oct 08 22:35:49 crc kubenswrapper[4834]: I1008 22:35:49.957025 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:49 crc kubenswrapper[4834]: I1008 22:35:49.959383 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 22:35:49 crc kubenswrapper[4834]: I1008 22:35:49.969855 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc"] Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.106479 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.106564 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.106648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5th2q\" (UniqueName: \"kubernetes.io/projected/c6c7aec8-9631-4ba0-87af-691c252bd8f1-kube-api-access-5th2q\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: 
I1008 22:35:50.208045 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.208235 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5th2q\" (UniqueName: \"kubernetes.io/projected/c6c7aec8-9631-4ba0-87af-691c252bd8f1-kube-api-access-5th2q\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.208303 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.208855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.208898 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.250741 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5th2q\" (UniqueName: \"kubernetes.io/projected/c6c7aec8-9631-4ba0-87af-691c252bd8f1-kube-api-access-5th2q\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.274358 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.456592 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc"] Oct 08 22:35:50 crc kubenswrapper[4834]: W1008 22:35:50.466027 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c7aec8_9631_4ba0_87af_691c252bd8f1.slice/crio-b7c7176a2db80c77d03e1788cc5e9111bbd2e47444d966256722fe7bcff99a88 WatchSource:0}: Error finding container b7c7176a2db80c77d03e1788cc5e9111bbd2e47444d966256722fe7bcff99a88: Status 404 returned error can't find the container with id b7c7176a2db80c77d03e1788cc5e9111bbd2e47444d966256722fe7bcff99a88 Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.926342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" 
event={"ID":"c6c7aec8-9631-4ba0-87af-691c252bd8f1","Type":"ContainerStarted","Data":"b34102382624af9df1c910fd94b9b6cbba1db99fbd1a0a501a3ce8005b8c5d68"} Oct 08 22:35:50 crc kubenswrapper[4834]: I1008 22:35:50.926398 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" event={"ID":"c6c7aec8-9631-4ba0-87af-691c252bd8f1","Type":"ContainerStarted","Data":"b7c7176a2db80c77d03e1788cc5e9111bbd2e47444d966256722fe7bcff99a88"} Oct 08 22:35:51 crc kubenswrapper[4834]: I1008 22:35:51.063597 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zx6gd" Oct 08 22:35:51 crc kubenswrapper[4834]: I1008 22:35:51.934518 4834 generic.go:334] "Generic (PLEG): container finished" podID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerID="b34102382624af9df1c910fd94b9b6cbba1db99fbd1a0a501a3ce8005b8c5d68" exitCode=0 Oct 08 22:35:51 crc kubenswrapper[4834]: I1008 22:35:51.934611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" event={"ID":"c6c7aec8-9631-4ba0-87af-691c252bd8f1","Type":"ContainerDied","Data":"b34102382624af9df1c910fd94b9b6cbba1db99fbd1a0a501a3ce8005b8c5d68"} Oct 08 22:35:51 crc kubenswrapper[4834]: I1008 22:35:51.976465 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b2cdc"] Oct 08 22:35:51 crc kubenswrapper[4834]: I1008 22:35:51.976728 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" containerName="controller-manager" containerID="cri-o://e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98" gracePeriod=30 Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.047252 4834 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl"] Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.047476 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" podUID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" containerName="route-controller-manager" containerID="cri-o://e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9" gracePeriod=30 Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.420909 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.424819 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.540786 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbqtl\" (UniqueName: \"kubernetes.io/projected/88947829-240b-4ebf-9125-c03a9e4fd9df-kube-api-access-vbqtl\") pod \"88947829-240b-4ebf-9125-c03a9e4fd9df\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.541182 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-client-ca\") pod \"88947829-240b-4ebf-9125-c03a9e4fd9df\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.541813 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-client-ca" (OuterVolumeSpecName: "client-ca") pod "88947829-240b-4ebf-9125-c03a9e4fd9df" (UID: 
"88947829-240b-4ebf-9125-c03a9e4fd9df"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.541851 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-proxy-ca-bundles\") pod \"88947829-240b-4ebf-9125-c03a9e4fd9df\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.541937 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-config\") pod \"88947829-240b-4ebf-9125-c03a9e4fd9df\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.541930 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "88947829-240b-4ebf-9125-c03a9e4fd9df" (UID: "88947829-240b-4ebf-9125-c03a9e4fd9df"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.541959 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-config\") pod \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542069 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-serving-cert\") pod \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542110 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-client-ca\") pod \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542215 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrftz\" (UniqueName: \"kubernetes.io/projected/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-kube-api-access-qrftz\") pod \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\" (UID: \"6a567e9e-5bea-48c9-ad1e-5ed2332f0341\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542242 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88947829-240b-4ebf-9125-c03a9e4fd9df-serving-cert\") pod \"88947829-240b-4ebf-9125-c03a9e4fd9df\" (UID: \"88947829-240b-4ebf-9125-c03a9e4fd9df\") " Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542428 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-config" (OuterVolumeSpecName: "config") pod "88947829-240b-4ebf-9125-c03a9e4fd9df" (UID: "88947829-240b-4ebf-9125-c03a9e4fd9df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542451 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542469 4834 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542921 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-config" (OuterVolumeSpecName: "config") pod "6a567e9e-5bea-48c9-ad1e-5ed2332f0341" (UID: "6a567e9e-5bea-48c9-ad1e-5ed2332f0341"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.542994 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-client-ca" (OuterVolumeSpecName: "client-ca") pod "6a567e9e-5bea-48c9-ad1e-5ed2332f0341" (UID: "6a567e9e-5bea-48c9-ad1e-5ed2332f0341"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.547675 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88947829-240b-4ebf-9125-c03a9e4fd9df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "88947829-240b-4ebf-9125-c03a9e4fd9df" (UID: "88947829-240b-4ebf-9125-c03a9e4fd9df"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.547719 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6a567e9e-5bea-48c9-ad1e-5ed2332f0341" (UID: "6a567e9e-5bea-48c9-ad1e-5ed2332f0341"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.547746 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-kube-api-access-qrftz" (OuterVolumeSpecName: "kube-api-access-qrftz") pod "6a567e9e-5bea-48c9-ad1e-5ed2332f0341" (UID: "6a567e9e-5bea-48c9-ad1e-5ed2332f0341"). InnerVolumeSpecName "kube-api-access-qrftz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.549984 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88947829-240b-4ebf-9125-c03a9e4fd9df-kube-api-access-vbqtl" (OuterVolumeSpecName: "kube-api-access-vbqtl") pod "88947829-240b-4ebf-9125-c03a9e4fd9df" (UID: "88947829-240b-4ebf-9125-c03a9e4fd9df"). InnerVolumeSpecName "kube-api-access-vbqtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643266 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88947829-240b-4ebf-9125-c03a9e4fd9df-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643587 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643644 4834 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-client-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643696 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643744 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrftz\" (UniqueName: \"kubernetes.io/projected/6a567e9e-5bea-48c9-ad1e-5ed2332f0341-kube-api-access-qrftz\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643797 4834 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88947829-240b-4ebf-9125-c03a9e4fd9df-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.643854 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbqtl\" (UniqueName: \"kubernetes.io/projected/88947829-240b-4ebf-9125-c03a9e4fd9df-kube-api-access-vbqtl\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.941716 4834 generic.go:334] "Generic 
(PLEG): container finished" podID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" containerID="e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9" exitCode=0 Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.942066 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" event={"ID":"6a567e9e-5bea-48c9-ad1e-5ed2332f0341","Type":"ContainerDied","Data":"e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9"} Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.942212 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" event={"ID":"6a567e9e-5bea-48c9-ad1e-5ed2332f0341","Type":"ContainerDied","Data":"1f779fa66ee6aeebd219965c219eaee9ddc04f52c598506a6da59be493162c08"} Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.942103 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.942239 4834 scope.go:117] "RemoveContainer" containerID="e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.943436 4834 generic.go:334] "Generic (PLEG): container finished" podID="88947829-240b-4ebf-9125-c03a9e4fd9df" containerID="e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98" exitCode=0 Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.943479 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" event={"ID":"88947829-240b-4ebf-9125-c03a9e4fd9df","Type":"ContainerDied","Data":"e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98"} Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.943573 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" event={"ID":"88947829-240b-4ebf-9125-c03a9e4fd9df","Type":"ContainerDied","Data":"1225fc322df52ff9dd9be7bcb0dd1e27f8337b969472620c23b57dc17b01249e"} Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.943621 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-b2cdc" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.961063 4834 scope.go:117] "RemoveContainer" containerID="e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9" Oct 08 22:35:52 crc kubenswrapper[4834]: E1008 22:35:52.961569 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9\": container with ID starting with e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9 not found: ID does not exist" containerID="e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.961670 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9"} err="failed to get container status \"e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9\": rpc error: code = NotFound desc = could not find container \"e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9\": container with ID starting with e2902d70b98b5442828b88e375972daf16e8bb4ef6c59fd02f4e74112a89f6f9 not found: ID does not exist" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.961755 4834 scope.go:117] "RemoveContainer" containerID="e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.980790 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-b2cdc"] Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.989355 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-b2cdc"] Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.989650 4834 scope.go:117] "RemoveContainer" containerID="e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98" Oct 08 22:35:52 crc kubenswrapper[4834]: E1008 22:35:52.993329 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98\": container with ID starting with e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98 not found: ID does not exist" containerID="e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98" Oct 08 22:35:52 crc kubenswrapper[4834]: I1008 22:35:52.993395 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98"} err="failed to get container status \"e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98\": rpc error: code = NotFound desc = could not find container \"e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98\": container with ID starting with e552e385de5df8727f317cd27c7a7c1dbf2b469ec904a5bdd82f527816c25f98 not found: ID does not exist" Oct 08 22:35:53 crc kubenswrapper[4834]: I1008 22:35:53.000975 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl"] Oct 08 22:35:53 crc kubenswrapper[4834]: I1008 22:35:53.006009 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mn7cl"] Oct 08 22:35:53 crc kubenswrapper[4834]: I1008 22:35:53.567237 4834 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" path="/var/lib/kubelet/pods/6a567e9e-5bea-48c9-ad1e-5ed2332f0341/volumes" Oct 08 22:35:53 crc kubenswrapper[4834]: I1008 22:35:53.568495 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" path="/var/lib/kubelet/pods/88947829-240b-4ebf-9125-c03a9e4fd9df/volumes" Oct 08 22:35:53 crc kubenswrapper[4834]: I1008 22:35:53.950134 4834 generic.go:334] "Generic (PLEG): container finished" podID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerID="0b00f6cbb8f99b021870dc736ba8a10079d7dc821e0a6f0c80ee4bc71db2dc46" exitCode=0 Oct 08 22:35:53 crc kubenswrapper[4834]: I1008 22:35:53.950269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" event={"ID":"c6c7aec8-9631-4ba0-87af-691c252bd8f1","Type":"ContainerDied","Data":"0b00f6cbb8f99b021870dc736ba8a10079d7dc821e0a6f0c80ee4bc71db2dc46"} Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.137025 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8449586769-l9ncg"] Oct 08 22:35:54 crc kubenswrapper[4834]: E1008 22:35:54.137446 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" containerName="route-controller-manager" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.137485 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" containerName="route-controller-manager" Oct 08 22:35:54 crc kubenswrapper[4834]: E1008 22:35:54.137521 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" containerName="controller-manager" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.137535 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" 
containerName="controller-manager" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.137699 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="88947829-240b-4ebf-9125-c03a9e4fd9df" containerName="controller-manager" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.137736 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a567e9e-5bea-48c9-ad1e-5ed2332f0341" containerName="route-controller-manager" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.138333 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.141713 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.142491 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.142729 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.143001 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.143694 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.143737 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.145969 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp"] Oct 08 22:35:54 crc 
kubenswrapper[4834]: I1008 22:35:54.146855 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.152233 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8449586769-l9ncg"] Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.155841 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.156103 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.156295 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.171470 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.171801 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.177832 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.190245 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-config\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: 
I1008 22:35:54.190577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-proxy-ca-bundles\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.190677 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-585xp\" (UniqueName: \"kubernetes.io/projected/cb11c125-1257-4f0a-9150-748390cc6fbd-kube-api-access-585xp\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.190765 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-client-ca\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.190921 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb11c125-1257-4f0a-9150-748390cc6fbd-serving-cert\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.192462 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp"] Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.196332 4834 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292024 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wllmp\" (UniqueName: \"kubernetes.io/projected/d8ca1e51-0164-4a56-b830-dbc8945debdd-kube-api-access-wllmp\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292100 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8ca1e51-0164-4a56-b830-dbc8945debdd-config\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292132 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8ca1e51-0164-4a56-b830-dbc8945debdd-client-ca\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292225 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-proxy-ca-bundles\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292262 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-585xp\" (UniqueName: \"kubernetes.io/projected/cb11c125-1257-4f0a-9150-748390cc6fbd-kube-api-access-585xp\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-client-ca\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292360 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb11c125-1257-4f0a-9150-748390cc6fbd-serving-cert\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292406 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8ca1e51-0164-4a56-b830-dbc8945debdd-serving-cert\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.292442 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-config\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " 
pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.293937 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-client-ca\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.294372 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-proxy-ca-bundles\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.294487 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb11c125-1257-4f0a-9150-748390cc6fbd-config\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.299906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb11c125-1257-4f0a-9150-748390cc6fbd-serving-cert\") pod \"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.310282 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-585xp\" (UniqueName: \"kubernetes.io/projected/cb11c125-1257-4f0a-9150-748390cc6fbd-kube-api-access-585xp\") pod 
\"controller-manager-8449586769-l9ncg\" (UID: \"cb11c125-1257-4f0a-9150-748390cc6fbd\") " pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.393565 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8ca1e51-0164-4a56-b830-dbc8945debdd-serving-cert\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.393664 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wllmp\" (UniqueName: \"kubernetes.io/projected/d8ca1e51-0164-4a56-b830-dbc8945debdd-kube-api-access-wllmp\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.393706 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8ca1e51-0164-4a56-b830-dbc8945debdd-config\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.393735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8ca1e51-0164-4a56-b830-dbc8945debdd-client-ca\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.394988 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8ca1e51-0164-4a56-b830-dbc8945debdd-config\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.395059 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d8ca1e51-0164-4a56-b830-dbc8945debdd-client-ca\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.398825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8ca1e51-0164-4a56-b830-dbc8945debdd-serving-cert\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.419756 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wllmp\" (UniqueName: \"kubernetes.io/projected/d8ca1e51-0164-4a56-b830-dbc8945debdd-kube-api-access-wllmp\") pod \"route-controller-manager-5c5f88f9f4-5xhpp\" (UID: \"d8ca1e51-0164-4a56-b830-dbc8945debdd\") " pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.479857 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.500968 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.727745 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8449586769-l9ncg"] Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.766499 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp"] Oct 08 22:35:54 crc kubenswrapper[4834]: W1008 22:35:54.771521 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ca1e51_0164_4a56_b830_dbc8945debdd.slice/crio-26bf9c55a371f3fa6bc5b0e5a7df5d80a9f230a50511e65f13f7060a99dbfc48 WatchSource:0}: Error finding container 26bf9c55a371f3fa6bc5b0e5a7df5d80a9f230a50511e65f13f7060a99dbfc48: Status 404 returned error can't find the container with id 26bf9c55a371f3fa6bc5b0e5a7df5d80a9f230a50511e65f13f7060a99dbfc48 Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.961880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" event={"ID":"cb11c125-1257-4f0a-9150-748390cc6fbd","Type":"ContainerStarted","Data":"10984f6d01dfca441ab9d9eea96cec094eb42b4ef50ae4ab5b743bf93f2a2769"} Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.961950 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" event={"ID":"cb11c125-1257-4f0a-9150-748390cc6fbd","Type":"ContainerStarted","Data":"aa30c1f149b93814b92945fce2727493dab67139dcd7014397de61b5d8429f13"} Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.962166 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.963570 4834 
patch_prober.go:28] interesting pod/controller-manager-8449586769-l9ncg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.963624 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" podUID="cb11c125-1257-4f0a-9150-748390cc6fbd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.967351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" event={"ID":"d8ca1e51-0164-4a56-b830-dbc8945debdd","Type":"ContainerStarted","Data":"6070161b163952e8d66b1270ab3c9eaa348719edc4d275f29f614db0c2fd6d96"} Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.967398 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" event={"ID":"d8ca1e51-0164-4a56-b830-dbc8945debdd","Type":"ContainerStarted","Data":"26bf9c55a371f3fa6bc5b0e5a7df5d80a9f230a50511e65f13f7060a99dbfc48"} Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.967575 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.972070 4834 generic.go:334] "Generic (PLEG): container finished" podID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerID="dcb5b4cddcff6190557d50b247b8627d666247a9703aa286320fed8bbabd7bd9" exitCode=0 Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.972104 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" event={"ID":"c6c7aec8-9631-4ba0-87af-691c252bd8f1","Type":"ContainerDied","Data":"dcb5b4cddcff6190557d50b247b8627d666247a9703aa286320fed8bbabd7bd9"} Oct 08 22:35:54 crc kubenswrapper[4834]: I1008 22:35:54.986986 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" podStartSLOduration=2.986963519 podStartE2EDuration="2.986963519s" podCreationTimestamp="2025-10-08 22:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:35:54.986231491 +0000 UTC m=+762.809116247" watchObservedRunningTime="2025-10-08 22:35:54.986963519 +0000 UTC m=+762.809848275" Oct 08 22:35:55 crc kubenswrapper[4834]: I1008 22:35:55.013898 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" podStartSLOduration=3.013876389 podStartE2EDuration="3.013876389s" podCreationTimestamp="2025-10-08 22:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:35:55.011962543 +0000 UTC m=+762.834847279" watchObservedRunningTime="2025-10-08 22:35:55.013876389 +0000 UTC m=+762.836761135" Oct 08 22:35:55 crc kubenswrapper[4834]: I1008 22:35:55.250605 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c5f88f9f4-5xhpp" Oct 08 22:35:55 crc kubenswrapper[4834]: I1008 22:35:55.982989 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8449586769-l9ncg" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.246131 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.417461 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-bundle\") pod \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.417552 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5th2q\" (UniqueName: \"kubernetes.io/projected/c6c7aec8-9631-4ba0-87af-691c252bd8f1-kube-api-access-5th2q\") pod \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.417668 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-util\") pod \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\" (UID: \"c6c7aec8-9631-4ba0-87af-691c252bd8f1\") " Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.418766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-bundle" (OuterVolumeSpecName: "bundle") pod "c6c7aec8-9631-4ba0-87af-691c252bd8f1" (UID: "c6c7aec8-9631-4ba0-87af-691c252bd8f1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.424493 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c7aec8-9631-4ba0-87af-691c252bd8f1-kube-api-access-5th2q" (OuterVolumeSpecName: "kube-api-access-5th2q") pod "c6c7aec8-9631-4ba0-87af-691c252bd8f1" (UID: "c6c7aec8-9631-4ba0-87af-691c252bd8f1"). InnerVolumeSpecName "kube-api-access-5th2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.449924 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-util" (OuterVolumeSpecName: "util") pod "c6c7aec8-9631-4ba0-87af-691c252bd8f1" (UID: "c6c7aec8-9631-4ba0-87af-691c252bd8f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.519085 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.519117 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5th2q\" (UniqueName: \"kubernetes.io/projected/c6c7aec8-9631-4ba0-87af-691c252bd8f1-kube-api-access-5th2q\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.519128 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6c7aec8-9631-4ba0-87af-691c252bd8f1-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.988386 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" event={"ID":"c6c7aec8-9631-4ba0-87af-691c252bd8f1","Type":"ContainerDied","Data":"b7c7176a2db80c77d03e1788cc5e9111bbd2e47444d966256722fe7bcff99a88"} Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.989022 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c7176a2db80c77d03e1788cc5e9111bbd2e47444d966256722fe7bcff99a88" Oct 08 22:35:56 crc kubenswrapper[4834]: I1008 22:35:56.988631 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.854866 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4"] Oct 08 22:35:58 crc kubenswrapper[4834]: E1008 22:35:58.855119 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="util" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.855134 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="util" Oct 08 22:35:58 crc kubenswrapper[4834]: E1008 22:35:58.855180 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="pull" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.855188 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="pull" Oct 08 22:35:58 crc kubenswrapper[4834]: E1008 22:35:58.855200 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="extract" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.855208 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="extract" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.855330 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c7aec8-9631-4ba0-87af-691c252bd8f1" containerName="extract" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.855816 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.858239 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bwhxw" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.858890 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.859222 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.869738 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4"] Oct 08 22:35:58 crc kubenswrapper[4834]: I1008 22:35:58.955589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdgb\" (UniqueName: \"kubernetes.io/projected/8d63afa4-5c31-460c-9409-4752cbe62b7b-kube-api-access-2qdgb\") pod \"nmstate-operator-858ddd8f98-9hlt4\" (UID: \"8d63afa4-5c31-460c-9409-4752cbe62b7b\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" Oct 08 22:35:59 crc kubenswrapper[4834]: I1008 22:35:59.056684 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdgb\" (UniqueName: \"kubernetes.io/projected/8d63afa4-5c31-460c-9409-4752cbe62b7b-kube-api-access-2qdgb\") pod \"nmstate-operator-858ddd8f98-9hlt4\" (UID: \"8d63afa4-5c31-460c-9409-4752cbe62b7b\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" Oct 08 22:35:59 crc kubenswrapper[4834]: I1008 22:35:59.093963 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdgb\" (UniqueName: \"kubernetes.io/projected/8d63afa4-5c31-460c-9409-4752cbe62b7b-kube-api-access-2qdgb\") pod \"nmstate-operator-858ddd8f98-9hlt4\" (UID: 
\"8d63afa4-5c31-460c-9409-4752cbe62b7b\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" Oct 08 22:35:59 crc kubenswrapper[4834]: I1008 22:35:59.173588 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" Oct 08 22:35:59 crc kubenswrapper[4834]: I1008 22:35:59.617180 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4"] Oct 08 22:36:00 crc kubenswrapper[4834]: I1008 22:36:00.017469 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" event={"ID":"8d63afa4-5c31-460c-9409-4752cbe62b7b","Type":"ContainerStarted","Data":"7bc134df8dc3e5e2b7e899831946cee61a184008367d43eb0d5b582cda2fe9b3"} Oct 08 22:36:00 crc kubenswrapper[4834]: I1008 22:36:00.873038 4834 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 08 22:36:02 crc kubenswrapper[4834]: I1008 22:36:02.033023 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" event={"ID":"8d63afa4-5c31-460c-9409-4752cbe62b7b","Type":"ContainerStarted","Data":"05db66b0dc7de4bfa95187e5d770a9a31889e2f313109c8456ea6674eaa67c2d"} Oct 08 22:36:02 crc kubenswrapper[4834]: I1008 22:36:02.063741 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-9hlt4" podStartSLOduration=2.163080651 podStartE2EDuration="4.063712283s" podCreationTimestamp="2025-10-08 22:35:58 +0000 UTC" firstStartedPulling="2025-10-08 22:35:59.628227138 +0000 UTC m=+767.451111884" lastFinishedPulling="2025-10-08 22:36:01.52885876 +0000 UTC m=+769.351743516" observedRunningTime="2025-10-08 22:36:02.059104809 +0000 UTC m=+769.881989555" watchObservedRunningTime="2025-10-08 22:36:02.063712283 +0000 UTC m=+769.886597069" Oct 08 22:36:02 crc kubenswrapper[4834]: 
I1008 22:36:02.991231 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h"] Oct 08 22:36:02 crc kubenswrapper[4834]: I1008 22:36:02.992603 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" Oct 08 22:36:02 crc kubenswrapper[4834]: I1008 22:36:02.994867 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qw8x2" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.024112 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.037371 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgmq\" (UniqueName: \"kubernetes.io/projected/ab078381-fb7e-4f3b-959e-bca22362c6bc-kube-api-access-txgmq\") pod \"nmstate-metrics-fdff9cb8d-phk7h\" (UID: \"ab078381-fb7e-4f3b-959e-bca22362c6bc\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.055420 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7nbqk"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.056380 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.056569 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.056999 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.058657 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.067046 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.138878 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgmq\" (UniqueName: \"kubernetes.io/projected/ab078381-fb7e-4f3b-959e-bca22362c6bc-kube-api-access-txgmq\") pod \"nmstate-metrics-fdff9cb8d-phk7h\" (UID: \"ab078381-fb7e-4f3b-959e-bca22362c6bc\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.148489 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.149112 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.152231 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4gzlr" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.153061 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.153728 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.165386 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.167085 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgmq\" (UniqueName: \"kubernetes.io/projected/ab078381-fb7e-4f3b-959e-bca22362c6bc-kube-api-access-txgmq\") pod \"nmstate-metrics-fdff9cb8d-phk7h\" (UID: \"ab078381-fb7e-4f3b-959e-bca22362c6bc\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.240383 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhbl6\" (UniqueName: \"kubernetes.io/projected/421a6435-345f-4452-a180-93038948456a-kube-api-access-nhbl6\") pod \"nmstate-webhook-6cdbc54649-dmfct\" (UID: \"421a6435-345f-4452-a180-93038948456a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.240781 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/421a6435-345f-4452-a180-93038948456a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dmfct\" (UID: \"421a6435-345f-4452-a180-93038948456a\") " 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.240821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-nmstate-lock\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.240861 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-ovs-socket\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.240945 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-dbus-socket\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.240989 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw4jg\" (UniqueName: \"kubernetes.io/projected/199673d0-23b5-446b-a8c0-f7e49043acc8-kube-api-access-fw4jg\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.331588 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.340240 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-89c65d5fc-4697f"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.340919 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341810 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/421a6435-345f-4452-a180-93038948456a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dmfct\" (UID: \"421a6435-345f-4452-a180-93038948456a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341867 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b787c76-801d-4aa7-ad57-24eb4e4a1232-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341894 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-nmstate-lock\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341911 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-ovs-socket\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " 
pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341932 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b787c76-801d-4aa7-ad57-24eb4e4a1232-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341973 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-dbus-socket\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.341998 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw4jg\" (UniqueName: \"kubernetes.io/projected/199673d0-23b5-446b-a8c0-f7e49043acc8-kube-api-access-fw4jg\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.342033 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz52w\" (UniqueName: \"kubernetes.io/projected/9b787c76-801d-4aa7-ad57-24eb4e4a1232-kube-api-access-hz52w\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.342055 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhbl6\" (UniqueName: \"kubernetes.io/projected/421a6435-345f-4452-a180-93038948456a-kube-api-access-nhbl6\") pod 
\"nmstate-webhook-6cdbc54649-dmfct\" (UID: \"421a6435-345f-4452-a180-93038948456a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.342700 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-nmstate-lock\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.342791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-ovs-socket\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.343533 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/199673d0-23b5-446b-a8c0-f7e49043acc8-dbus-socket\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.349215 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/421a6435-345f-4452-a180-93038948456a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-dmfct\" (UID: \"421a6435-345f-4452-a180-93038948456a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.368124 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89c65d5fc-4697f"] Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.372133 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw4jg\" (UniqueName: 
\"kubernetes.io/projected/199673d0-23b5-446b-a8c0-f7e49043acc8-kube-api-access-fw4jg\") pod \"nmstate-handler-7nbqk\" (UID: \"199673d0-23b5-446b-a8c0-f7e49043acc8\") " pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.385728 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhbl6\" (UniqueName: \"kubernetes.io/projected/421a6435-345f-4452-a180-93038948456a-kube-api-access-nhbl6\") pod \"nmstate-webhook-6cdbc54649-dmfct\" (UID: \"421a6435-345f-4452-a180-93038948456a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.390392 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.400809 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443000 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b787c76-801d-4aa7-ad57-24eb4e4a1232-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443052 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-oauth-serving-cert\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443080 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-serving-cert\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-oauth-config\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443139 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9mx9\" (UniqueName: \"kubernetes.io/projected/c5dc1f57-2762-4227-8ea4-e4bd70db699d-kube-api-access-w9mx9\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443170 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-trusted-ca-bundle\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443193 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz52w\" (UniqueName: \"kubernetes.io/projected/9b787c76-801d-4aa7-ad57-24eb4e4a1232-kube-api-access-hz52w\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 
22:36:03.443224 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-service-ca\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443242 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b787c76-801d-4aa7-ad57-24eb4e4a1232-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.443259 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-config\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.447904 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b787c76-801d-4aa7-ad57-24eb4e4a1232-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.450274 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b787c76-801d-4aa7-ad57-24eb4e4a1232-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 
22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.476695 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz52w\" (UniqueName: \"kubernetes.io/projected/9b787c76-801d-4aa7-ad57-24eb4e4a1232-kube-api-access-hz52w\") pod \"nmstate-console-plugin-6b874cbd85-g2lcj\" (UID: \"9b787c76-801d-4aa7-ad57-24eb4e4a1232\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.544639 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-trusted-ca-bundle\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.544680 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9mx9\" (UniqueName: \"kubernetes.io/projected/c5dc1f57-2762-4227-8ea4-e4bd70db699d-kube-api-access-w9mx9\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.544719 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-service-ca\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.544740 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-config\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc 
kubenswrapper[4834]: I1008 22:36:03.544766 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-oauth-serving-cert\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.544811 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-serving-cert\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.544839 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-oauth-config\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.546167 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-service-ca\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.546573 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-oauth-serving-cert\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.546953 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-trusted-ca-bundle\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.547875 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-config\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.549236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-oauth-config\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.551666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5dc1f57-2762-4227-8ea4-e4bd70db699d-console-serving-cert\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.561872 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9mx9\" (UniqueName: \"kubernetes.io/projected/c5dc1f57-2762-4227-8ea4-e4bd70db699d-kube-api-access-w9mx9\") pod \"console-89c65d5fc-4697f\" (UID: \"c5dc1f57-2762-4227-8ea4-e4bd70db699d\") " pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.728285 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.765281 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.805974 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h"] Oct 08 22:36:03 crc kubenswrapper[4834]: W1008 22:36:03.811253 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab078381_fb7e_4f3b_959e_bca22362c6bc.slice/crio-f6f757044d660d36ee38c213bac6694dc8c0e34c8ed51f959eb26ebdc1674729 WatchSource:0}: Error finding container f6f757044d660d36ee38c213bac6694dc8c0e34c8ed51f959eb26ebdc1674729: Status 404 returned error can't find the container with id f6f757044d660d36ee38c213bac6694dc8c0e34c8ed51f959eb26ebdc1674729 Oct 08 22:36:03 crc kubenswrapper[4834]: I1008 22:36:03.889054 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct"] Oct 08 22:36:03 crc kubenswrapper[4834]: W1008 22:36:03.898311 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421a6435_345f_4452_a180_93038948456a.slice/crio-8e06c0a68ad4516d1d4a2874fb8bb729c7237088f43d619076320dd5266fdfb7 WatchSource:0}: Error finding container 8e06c0a68ad4516d1d4a2874fb8bb729c7237088f43d619076320dd5266fdfb7: Status 404 returned error can't find the container with id 8e06c0a68ad4516d1d4a2874fb8bb729c7237088f43d619076320dd5266fdfb7 Oct 08 22:36:04 crc kubenswrapper[4834]: I1008 22:36:04.044912 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7nbqk" 
event={"ID":"199673d0-23b5-446b-a8c0-f7e49043acc8","Type":"ContainerStarted","Data":"b868b53be15536ebaf076c162f40b3a076bf172f73222b8f5b31b30f82db2b1a"} Oct 08 22:36:04 crc kubenswrapper[4834]: I1008 22:36:04.047688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" event={"ID":"ab078381-fb7e-4f3b-959e-bca22362c6bc","Type":"ContainerStarted","Data":"f6f757044d660d36ee38c213bac6694dc8c0e34c8ed51f959eb26ebdc1674729"} Oct 08 22:36:04 crc kubenswrapper[4834]: I1008 22:36:04.048451 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" event={"ID":"421a6435-345f-4452-a180-93038948456a","Type":"ContainerStarted","Data":"8e06c0a68ad4516d1d4a2874fb8bb729c7237088f43d619076320dd5266fdfb7"} Oct 08 22:36:04 crc kubenswrapper[4834]: I1008 22:36:04.217352 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89c65d5fc-4697f"] Oct 08 22:36:04 crc kubenswrapper[4834]: W1008 22:36:04.226139 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5dc1f57_2762_4227_8ea4_e4bd70db699d.slice/crio-2d36778b5dae4d3b5ef173ba3cbcc9237152d86da9769700b2cfcb9aaf06a029 WatchSource:0}: Error finding container 2d36778b5dae4d3b5ef173ba3cbcc9237152d86da9769700b2cfcb9aaf06a029: Status 404 returned error can't find the container with id 2d36778b5dae4d3b5ef173ba3cbcc9237152d86da9769700b2cfcb9aaf06a029 Oct 08 22:36:04 crc kubenswrapper[4834]: I1008 22:36:04.276081 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj"] Oct 08 22:36:04 crc kubenswrapper[4834]: W1008 22:36:04.281990 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b787c76_801d_4aa7_ad57_24eb4e4a1232.slice/crio-a64b5343088482d5fd293b4e54abcd63f856c5493aab5f48699ee82af6d7cee3 WatchSource:0}: Error finding container a64b5343088482d5fd293b4e54abcd63f856c5493aab5f48699ee82af6d7cee3: Status 404 returned error can't find the container with id a64b5343088482d5fd293b4e54abcd63f856c5493aab5f48699ee82af6d7cee3 Oct 08 22:36:05 crc kubenswrapper[4834]: I1008 22:36:05.062636 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89c65d5fc-4697f" event={"ID":"c5dc1f57-2762-4227-8ea4-e4bd70db699d","Type":"ContainerStarted","Data":"004bc6313083276f3cafa0a5cc0f9d075dee56e4c789310004ee03c164da27ae"} Oct 08 22:36:05 crc kubenswrapper[4834]: I1008 22:36:05.063063 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89c65d5fc-4697f" event={"ID":"c5dc1f57-2762-4227-8ea4-e4bd70db699d","Type":"ContainerStarted","Data":"2d36778b5dae4d3b5ef173ba3cbcc9237152d86da9769700b2cfcb9aaf06a029"} Oct 08 22:36:05 crc kubenswrapper[4834]: I1008 22:36:05.065701 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" event={"ID":"9b787c76-801d-4aa7-ad57-24eb4e4a1232","Type":"ContainerStarted","Data":"a64b5343088482d5fd293b4e54abcd63f856c5493aab5f48699ee82af6d7cee3"} Oct 08 22:36:05 crc kubenswrapper[4834]: I1008 22:36:05.092786 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-89c65d5fc-4697f" podStartSLOduration=2.092754283 podStartE2EDuration="2.092754283s" podCreationTimestamp="2025-10-08 22:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:36:05.086578492 +0000 UTC m=+772.909463258" watchObservedRunningTime="2025-10-08 22:36:05.092754283 +0000 UTC m=+772.915639049" Oct 08 22:36:07 crc kubenswrapper[4834]: I1008 
22:36:07.098096 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" event={"ID":"ab078381-fb7e-4f3b-959e-bca22362c6bc","Type":"ContainerStarted","Data":"076cbb99b07aa1e6e6a24e0e2ad5cca571b22be3626e8d8256f3e6c3a0a66d21"} Oct 08 22:36:07 crc kubenswrapper[4834]: I1008 22:36:07.100242 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" event={"ID":"421a6435-345f-4452-a180-93038948456a","Type":"ContainerStarted","Data":"182c0fd698dec57c87932ca2959a93dd7dacadd15d76a78492e7256f67f7f18c"} Oct 08 22:36:07 crc kubenswrapper[4834]: I1008 22:36:07.101105 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:07 crc kubenswrapper[4834]: I1008 22:36:07.120649 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" podStartSLOduration=1.184976531 podStartE2EDuration="4.120628358s" podCreationTimestamp="2025-10-08 22:36:03 +0000 UTC" firstStartedPulling="2025-10-08 22:36:03.901783577 +0000 UTC m=+771.724668323" lastFinishedPulling="2025-10-08 22:36:06.837435374 +0000 UTC m=+774.660320150" observedRunningTime="2025-10-08 22:36:07.12026446 +0000 UTC m=+774.943149246" watchObservedRunningTime="2025-10-08 22:36:07.120628358 +0000 UTC m=+774.943513114" Oct 08 22:36:08 crc kubenswrapper[4834]: I1008 22:36:08.122209 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7nbqk" event={"ID":"199673d0-23b5-446b-a8c0-f7e49043acc8","Type":"ContainerStarted","Data":"78b08696e38620788588266a8d9f8a528509837a328f99656d647ce9b515c05f"} Oct 08 22:36:08 crc kubenswrapper[4834]: I1008 22:36:08.122763 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:08 crc kubenswrapper[4834]: I1008 22:36:08.147648 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7nbqk" podStartSLOduration=1.756821594 podStartE2EDuration="5.147629357s" podCreationTimestamp="2025-10-08 22:36:03 +0000 UTC" firstStartedPulling="2025-10-08 22:36:03.418388218 +0000 UTC m=+771.241272964" lastFinishedPulling="2025-10-08 22:36:06.809195941 +0000 UTC m=+774.632080727" observedRunningTime="2025-10-08 22:36:08.145775071 +0000 UTC m=+775.968659827" watchObservedRunningTime="2025-10-08 22:36:08.147629357 +0000 UTC m=+775.970514113" Oct 08 22:36:09 crc kubenswrapper[4834]: I1008 22:36:09.131163 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" event={"ID":"9b787c76-801d-4aa7-ad57-24eb4e4a1232","Type":"ContainerStarted","Data":"c6fb615903111e97d6485a390b72996e6fe13653ab16569b4de4ac70d61a6ea4"} Oct 08 22:36:09 crc kubenswrapper[4834]: I1008 22:36:09.150242 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-g2lcj" podStartSLOduration=2.455625994 podStartE2EDuration="6.150194736s" podCreationTimestamp="2025-10-08 22:36:03 +0000 UTC" firstStartedPulling="2025-10-08 22:36:04.284946917 +0000 UTC m=+772.107831693" lastFinishedPulling="2025-10-08 22:36:07.979515689 +0000 UTC m=+775.802400435" observedRunningTime="2025-10-08 22:36:09.149326934 +0000 UTC m=+776.972211690" watchObservedRunningTime="2025-10-08 22:36:09.150194736 +0000 UTC m=+776.973079492" Oct 08 22:36:10 crc kubenswrapper[4834]: I1008 22:36:10.141200 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" event={"ID":"ab078381-fb7e-4f3b-959e-bca22362c6bc","Type":"ContainerStarted","Data":"719b8bdcd67f4903e4a614f15bd0cd81450a3aba0e1afb05e7cb3f863123305f"} Oct 08 22:36:10 crc kubenswrapper[4834]: I1008 22:36:10.180900 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-phk7h" podStartSLOduration=2.7778017889999997 podStartE2EDuration="8.180872205s" podCreationTimestamp="2025-10-08 22:36:02 +0000 UTC" firstStartedPulling="2025-10-08 22:36:03.814621837 +0000 UTC m=+771.637506583" lastFinishedPulling="2025-10-08 22:36:09.217692233 +0000 UTC m=+777.040576999" observedRunningTime="2025-10-08 22:36:10.165657481 +0000 UTC m=+777.988542277" watchObservedRunningTime="2025-10-08 22:36:10.180872205 +0000 UTC m=+778.003756991" Oct 08 22:36:13 crc kubenswrapper[4834]: I1008 22:36:13.427504 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7nbqk" Oct 08 22:36:13 crc kubenswrapper[4834]: I1008 22:36:13.727633 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:13 crc kubenswrapper[4834]: I1008 22:36:13.728064 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:13 crc kubenswrapper[4834]: I1008 22:36:13.733384 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:14 crc kubenswrapper[4834]: I1008 22:36:14.178877 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-89c65d5fc-4697f" Oct 08 22:36:14 crc kubenswrapper[4834]: I1008 22:36:14.255584 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lbmrk"] Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 22:36:17.026380 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 
22:36:17.026877 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 22:36:17.027005 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 22:36:17.028104 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52a93a6fe63650f6a03b209142df1e3ce01f805d96d6142f73c9c419354c0aca"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 22:36:17.028243 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://52a93a6fe63650f6a03b209142df1e3ce01f805d96d6142f73c9c419354c0aca" gracePeriod=600 Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 22:36:17.203549 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="52a93a6fe63650f6a03b209142df1e3ce01f805d96d6142f73c9c419354c0aca" exitCode=0 Oct 08 22:36:17 crc kubenswrapper[4834]: I1008 22:36:17.203658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"52a93a6fe63650f6a03b209142df1e3ce01f805d96d6142f73c9c419354c0aca"} Oct 08 22:36:17 crc 
kubenswrapper[4834]: I1008 22:36:17.203735 4834 scope.go:117] "RemoveContainer" containerID="b487a4d9d655435511ba5dee397c544c606f8e8cfe6bd53b6020e82a7748e90e" Oct 08 22:36:18 crc kubenswrapper[4834]: I1008 22:36:18.216782 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"c4baa6db4e38cdb0c141b99f90c2bb4b5f7f47f94b55109a60fc26c2c73b21d9"} Oct 08 22:36:23 crc kubenswrapper[4834]: I1008 22:36:23.411400 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-dmfct" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.312370 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lbmrk" podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerName="console" containerID="cri-o://cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940" gracePeriod=15 Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.797609 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lbmrk_c5fef273-e7c8-4cb1-98a9-fe3359ad7c19/console/0.log" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.798014 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817579 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-oauth-serving-cert\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817654 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk447\" (UniqueName: \"kubernetes.io/projected/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-kube-api-access-bk447\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817687 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-trusted-ca-bundle\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817746 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-service-ca\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817766 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-config\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817830 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-serving-cert\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.817867 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-oauth-config\") pod \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\" (UID: \"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19\") " Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.820475 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.820931 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.822478 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-service-ca" (OuterVolumeSpecName: "service-ca") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.822857 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-config" (OuterVolumeSpecName: "console-config") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.870345 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.870847 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.871085 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-kube-api-access-bk447" (OuterVolumeSpecName: "kube-api-access-bk447") pod "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" (UID: "c5fef273-e7c8-4cb1-98a9-fe3359ad7c19"). InnerVolumeSpecName "kube-api-access-bk447". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.919718 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk447\" (UniqueName: \"kubernetes.io/projected/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-kube-api-access-bk447\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.920064 4834 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.920169 4834 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-service-ca\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.920257 4834 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.920333 4834 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.920434 4834 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:39 crc kubenswrapper[4834]: I1008 22:36:39.920511 4834 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:40 crc 
kubenswrapper[4834]: I1008 22:36:40.378013 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lbmrk_c5fef273-e7c8-4cb1-98a9-fe3359ad7c19/console/0.log" Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.378062 4834 generic.go:334] "Generic (PLEG): container finished" podID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerID="cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940" exitCode=2 Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.378092 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lbmrk" event={"ID":"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19","Type":"ContainerDied","Data":"cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940"} Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.378122 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lbmrk" event={"ID":"c5fef273-e7c8-4cb1-98a9-fe3359ad7c19","Type":"ContainerDied","Data":"4e740e33c7a64d475a0325275c391b4a2d90c7eb9e572b001e2b08face1a1518"} Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.378158 4834 scope.go:117] "RemoveContainer" containerID="cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940" Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.378272 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-lbmrk" Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.401026 4834 scope.go:117] "RemoveContainer" containerID="cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940" Oct 08 22:36:40 crc kubenswrapper[4834]: E1008 22:36:40.402460 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940\": container with ID starting with cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940 not found: ID does not exist" containerID="cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940" Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.402535 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940"} err="failed to get container status \"cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940\": rpc error: code = NotFound desc = could not find container \"cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940\": container with ID starting with cb0e1c9c31bce6e93b57ec5d97399411a4443fe383179666e9a4287f7b0f9940 not found: ID does not exist" Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.422785 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lbmrk"] Oct 08 22:36:40 crc kubenswrapper[4834]: I1008 22:36:40.432138 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lbmrk"] Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.395223 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q"] Oct 08 22:36:41 crc kubenswrapper[4834]: E1008 22:36:41.395499 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerName="console" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.395518 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerName="console" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.395665 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" containerName="console" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.396555 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.400940 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.428806 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q"] Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.440873 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.441112 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.441395 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rps\" (UniqueName: \"kubernetes.io/projected/f7d60134-60c3-498e-9550-fafb7900fcf1-kube-api-access-p6rps\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.543862 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.543928 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.543989 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rps\" (UniqueName: \"kubernetes.io/projected/f7d60134-60c3-498e-9550-fafb7900fcf1-kube-api-access-p6rps\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 
22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.544577 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.544901 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.562541 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5fef273-e7c8-4cb1-98a9-fe3359ad7c19" path="/var/lib/kubelet/pods/c5fef273-e7c8-4cb1-98a9-fe3359ad7c19/volumes" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.563702 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rps\" (UniqueName: \"kubernetes.io/projected/f7d60134-60c3-498e-9550-fafb7900fcf1-kube-api-access-p6rps\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:41 crc kubenswrapper[4834]: I1008 22:36:41.728101 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:42 crc kubenswrapper[4834]: I1008 22:36:42.015220 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q"] Oct 08 22:36:42 crc kubenswrapper[4834]: I1008 22:36:42.398000 4834 generic.go:334] "Generic (PLEG): container finished" podID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerID="7c79763ea6b7e389cca736dd6eab54b60c215d0abbe7f6d2445f867ef71f2162" exitCode=0 Oct 08 22:36:42 crc kubenswrapper[4834]: I1008 22:36:42.398097 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" event={"ID":"f7d60134-60c3-498e-9550-fafb7900fcf1","Type":"ContainerDied","Data":"7c79763ea6b7e389cca736dd6eab54b60c215d0abbe7f6d2445f867ef71f2162"} Oct 08 22:36:42 crc kubenswrapper[4834]: I1008 22:36:42.398618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" event={"ID":"f7d60134-60c3-498e-9550-fafb7900fcf1","Type":"ContainerStarted","Data":"9a474c3367792ef91fdbc974048b5899aa865d220ba3dc2dd105d4839c067446"} Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.661272 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l67w9"] Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.664228 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.676518 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tx5j\" (UniqueName: \"kubernetes.io/projected/8c8e1e17-2b8b-4b54-87a9-acaf42412344-kube-api-access-6tx5j\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.676608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-utilities\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.676696 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-catalog-content\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.681533 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l67w9"] Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.778295 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-catalog-content\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.778458 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6tx5j\" (UniqueName: \"kubernetes.io/projected/8c8e1e17-2b8b-4b54-87a9-acaf42412344-kube-api-access-6tx5j\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.778495 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-utilities\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.779061 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-catalog-content\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.779270 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-utilities\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:43 crc kubenswrapper[4834]: I1008 22:36:43.802306 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tx5j\" (UniqueName: \"kubernetes.io/projected/8c8e1e17-2b8b-4b54-87a9-acaf42412344-kube-api-access-6tx5j\") pod \"redhat-operators-l67w9\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:44 crc kubenswrapper[4834]: I1008 22:36:44.004521 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:44 crc kubenswrapper[4834]: I1008 22:36:44.413701 4834 generic.go:334] "Generic (PLEG): container finished" podID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerID="636876fa033e3e524fa56b9fa4b1558ac7dabea24a6889b389d85cee4460f695" exitCode=0 Oct 08 22:36:44 crc kubenswrapper[4834]: I1008 22:36:44.413761 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" event={"ID":"f7d60134-60c3-498e-9550-fafb7900fcf1","Type":"ContainerDied","Data":"636876fa033e3e524fa56b9fa4b1558ac7dabea24a6889b389d85cee4460f695"} Oct 08 22:36:44 crc kubenswrapper[4834]: I1008 22:36:44.469393 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l67w9"] Oct 08 22:36:44 crc kubenswrapper[4834]: W1008 22:36:44.476172 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8e1e17_2b8b_4b54_87a9_acaf42412344.slice/crio-be948157c0d29f3951b5ddef165a384d45b052611f75592d38bafd00e81019f8 WatchSource:0}: Error finding container be948157c0d29f3951b5ddef165a384d45b052611f75592d38bafd00e81019f8: Status 404 returned error can't find the container with id be948157c0d29f3951b5ddef165a384d45b052611f75592d38bafd00e81019f8 Oct 08 22:36:44 crc kubenswrapper[4834]: E1008 22:36:44.851002 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d60134_60c3_498e_9550_fafb7900fcf1.slice/crio-conmon-f752afe4a977c9dda0a04ba451d34d75e29586131e0fa198e8d3f4d4f15dcef4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d60134_60c3_498e_9550_fafb7900fcf1.slice/crio-f752afe4a977c9dda0a04ba451d34d75e29586131e0fa198e8d3f4d4f15dcef4.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:36:45 crc kubenswrapper[4834]: I1008 22:36:45.423608 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerID="786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b" exitCode=0 Oct 08 22:36:45 crc kubenswrapper[4834]: I1008 22:36:45.423694 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerDied","Data":"786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b"} Oct 08 22:36:45 crc kubenswrapper[4834]: I1008 22:36:45.423728 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerStarted","Data":"be948157c0d29f3951b5ddef165a384d45b052611f75592d38bafd00e81019f8"} Oct 08 22:36:45 crc kubenswrapper[4834]: I1008 22:36:45.427833 4834 generic.go:334] "Generic (PLEG): container finished" podID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerID="f752afe4a977c9dda0a04ba451d34d75e29586131e0fa198e8d3f4d4f15dcef4" exitCode=0 Oct 08 22:36:45 crc kubenswrapper[4834]: I1008 22:36:45.427873 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" event={"ID":"f7d60134-60c3-498e-9550-fafb7900fcf1","Type":"ContainerDied","Data":"f752afe4a977c9dda0a04ba451d34d75e29586131e0fa198e8d3f4d4f15dcef4"} Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.467876 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" 
event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerStarted","Data":"60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19"} Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.798767 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.974539 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6rps\" (UniqueName: \"kubernetes.io/projected/f7d60134-60c3-498e-9550-fafb7900fcf1-kube-api-access-p6rps\") pod \"f7d60134-60c3-498e-9550-fafb7900fcf1\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.974619 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-util\") pod \"f7d60134-60c3-498e-9550-fafb7900fcf1\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.974674 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-bundle\") pod \"f7d60134-60c3-498e-9550-fafb7900fcf1\" (UID: \"f7d60134-60c3-498e-9550-fafb7900fcf1\") " Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.976015 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-bundle" (OuterVolumeSpecName: "bundle") pod "f7d60134-60c3-498e-9550-fafb7900fcf1" (UID: "f7d60134-60c3-498e-9550-fafb7900fcf1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.984065 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d60134-60c3-498e-9550-fafb7900fcf1-kube-api-access-p6rps" (OuterVolumeSpecName: "kube-api-access-p6rps") pod "f7d60134-60c3-498e-9550-fafb7900fcf1" (UID: "f7d60134-60c3-498e-9550-fafb7900fcf1"). InnerVolumeSpecName "kube-api-access-p6rps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:36:46 crc kubenswrapper[4834]: I1008 22:36:46.990595 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-util" (OuterVolumeSpecName: "util") pod "f7d60134-60c3-498e-9550-fafb7900fcf1" (UID: "f7d60134-60c3-498e-9550-fafb7900fcf1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.076276 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.076325 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7d60134-60c3-498e-9550-fafb7900fcf1-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.076335 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6rps\" (UniqueName: \"kubernetes.io/projected/f7d60134-60c3-498e-9550-fafb7900fcf1-kube-api-access-p6rps\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.480695 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerID="60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19" exitCode=0 Oct 08 22:36:47 crc 
kubenswrapper[4834]: I1008 22:36:47.480823 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerDied","Data":"60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19"} Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.484743 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" event={"ID":"f7d60134-60c3-498e-9550-fafb7900fcf1","Type":"ContainerDied","Data":"9a474c3367792ef91fdbc974048b5899aa865d220ba3dc2dd105d4839c067446"} Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.484795 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a474c3367792ef91fdbc974048b5899aa865d220ba3dc2dd105d4839c067446" Oct 08 22:36:47 crc kubenswrapper[4834]: I1008 22:36:47.484859 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q" Oct 08 22:36:48 crc kubenswrapper[4834]: I1008 22:36:48.496611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerStarted","Data":"0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d"} Oct 08 22:36:54 crc kubenswrapper[4834]: I1008 22:36:54.005801 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:54 crc kubenswrapper[4834]: I1008 22:36:54.005874 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:54 crc kubenswrapper[4834]: I1008 22:36:54.082288 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l67w9" Oct 
08 22:36:54 crc kubenswrapper[4834]: I1008 22:36:54.106477 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l67w9" podStartSLOduration=8.59199579 podStartE2EDuration="11.106457684s" podCreationTimestamp="2025-10-08 22:36:43 +0000 UTC" firstStartedPulling="2025-10-08 22:36:45.427675542 +0000 UTC m=+813.250560328" lastFinishedPulling="2025-10-08 22:36:47.942137436 +0000 UTC m=+815.765022222" observedRunningTime="2025-10-08 22:36:48.534516462 +0000 UTC m=+816.357401238" watchObservedRunningTime="2025-10-08 22:36:54.106457684 +0000 UTC m=+821.929342440" Oct 08 22:36:54 crc kubenswrapper[4834]: I1008 22:36:54.589162 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:56 crc kubenswrapper[4834]: I1008 22:36:56.058597 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l67w9"] Oct 08 22:36:56 crc kubenswrapper[4834]: I1008 22:36:56.544292 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l67w9" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="registry-server" containerID="cri-o://0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d" gracePeriod=2 Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.001933 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.029671 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tx5j\" (UniqueName: \"kubernetes.io/projected/8c8e1e17-2b8b-4b54-87a9-acaf42412344-kube-api-access-6tx5j\") pod \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.029803 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-utilities\") pod \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.029836 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-catalog-content\") pod \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\" (UID: \"8c8e1e17-2b8b-4b54-87a9-acaf42412344\") " Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.032651 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-utilities" (OuterVolumeSpecName: "utilities") pod "8c8e1e17-2b8b-4b54-87a9-acaf42412344" (UID: "8c8e1e17-2b8b-4b54-87a9-acaf42412344"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.071768 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8e1e17-2b8b-4b54-87a9-acaf42412344-kube-api-access-6tx5j" (OuterVolumeSpecName: "kube-api-access-6tx5j") pod "8c8e1e17-2b8b-4b54-87a9-acaf42412344" (UID: "8c8e1e17-2b8b-4b54-87a9-acaf42412344"). InnerVolumeSpecName "kube-api-access-6tx5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.131278 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tx5j\" (UniqueName: \"kubernetes.io/projected/8c8e1e17-2b8b-4b54-87a9-acaf42412344-kube-api-access-6tx5j\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.131307 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.554121 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerID="0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d" exitCode=0 Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.554198 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l67w9" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.554197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerDied","Data":"0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d"} Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.554274 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l67w9" event={"ID":"8c8e1e17-2b8b-4b54-87a9-acaf42412344","Type":"ContainerDied","Data":"be948157c0d29f3951b5ddef165a384d45b052611f75592d38bafd00e81019f8"} Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.554302 4834 scope.go:117] "RemoveContainer" containerID="0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.575664 4834 scope.go:117] "RemoveContainer" 
containerID="60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.599346 4834 scope.go:117] "RemoveContainer" containerID="786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.639667 4834 scope.go:117] "RemoveContainer" containerID="0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d" Oct 08 22:36:57 crc kubenswrapper[4834]: E1008 22:36:57.642077 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d\": container with ID starting with 0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d not found: ID does not exist" containerID="0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.642165 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d"} err="failed to get container status \"0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d\": rpc error: code = NotFound desc = could not find container \"0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d\": container with ID starting with 0cd071c2cedaa7fe31d8ecca93a1ab3d63ca8f67ac8eb3f5bee581447f93f84d not found: ID does not exist" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.642211 4834 scope.go:117] "RemoveContainer" containerID="60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19" Oct 08 22:36:57 crc kubenswrapper[4834]: E1008 22:36:57.642760 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19\": container with ID starting with 
60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19 not found: ID does not exist" containerID="60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.642812 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19"} err="failed to get container status \"60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19\": rpc error: code = NotFound desc = could not find container \"60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19\": container with ID starting with 60aece7e31eef1d8b9125c1ba274308e1b3991e29fc57e002edeae15838b3d19 not found: ID does not exist" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.642845 4834 scope.go:117] "RemoveContainer" containerID="786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b" Oct 08 22:36:57 crc kubenswrapper[4834]: E1008 22:36:57.643326 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b\": container with ID starting with 786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b not found: ID does not exist" containerID="786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b" Oct 08 22:36:57 crc kubenswrapper[4834]: I1008 22:36:57.643362 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b"} err="failed to get container status \"786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b\": rpc error: code = NotFound desc = could not find container \"786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b\": container with ID starting with 786d9e6bfc5e39a0e225207c1a055e3dc0b38d8198ee3b450708bbd1c0f2e89b not found: ID does not 
exist" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.023542 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c8e1e17-2b8b-4b54-87a9-acaf42412344" (UID: "8c8e1e17-2b8b-4b54-87a9-acaf42412344"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.044007 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8e1e17-2b8b-4b54-87a9-acaf42412344-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.215286 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l67w9"] Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.224298 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l67w9"] Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380076 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs"] Oct 08 22:36:58 crc kubenswrapper[4834]: E1008 22:36:58.380366 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="extract-content" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380379 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="extract-content" Oct 08 22:36:58 crc kubenswrapper[4834]: E1008 22:36:58.380389 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="registry-server" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380395 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="registry-server" Oct 08 22:36:58 crc kubenswrapper[4834]: E1008 22:36:58.380404 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="util" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380410 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="util" Oct 08 22:36:58 crc kubenswrapper[4834]: E1008 22:36:58.380424 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="pull" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380430 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="pull" Oct 08 22:36:58 crc kubenswrapper[4834]: E1008 22:36:58.380441 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="extract" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380446 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="extract" Oct 08 22:36:58 crc kubenswrapper[4834]: E1008 22:36:58.380456 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="extract-utilities" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380462 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" containerName="extract-utilities" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380564 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d60134-60c3-498e-9550-fafb7900fcf1" containerName="extract" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.380573 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" 
containerName="registry-server" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.381071 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.383579 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.383919 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.383489 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.384233 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-wtpqg" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.385395 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.399571 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs"] Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.447642 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwgk\" (UniqueName: \"kubernetes.io/projected/3ad77aef-6ec4-4902-b24e-64599745e983-kube-api-access-mcwgk\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.447885 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ad77aef-6ec4-4902-b24e-64599745e983-apiservice-cert\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.447934 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ad77aef-6ec4-4902-b24e-64599745e983-webhook-cert\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.548833 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ad77aef-6ec4-4902-b24e-64599745e983-webhook-cert\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.548898 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwgk\" (UniqueName: \"kubernetes.io/projected/3ad77aef-6ec4-4902-b24e-64599745e983-kube-api-access-mcwgk\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.549008 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ad77aef-6ec4-4902-b24e-64599745e983-apiservice-cert\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: 
\"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.560828 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ad77aef-6ec4-4902-b24e-64599745e983-apiservice-cert\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.562254 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ad77aef-6ec4-4902-b24e-64599745e983-webhook-cert\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.578881 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcwgk\" (UniqueName: \"kubernetes.io/projected/3ad77aef-6ec4-4902-b24e-64599745e983-kube-api-access-mcwgk\") pod \"metallb-operator-controller-manager-64c74dd74f-st2xs\" (UID: \"3ad77aef-6ec4-4902-b24e-64599745e983\") " pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.665011 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq"] Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.665960 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.668367 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.668662 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pn2sq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.669612 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.695976 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.735518 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq"] Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.751891 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5afee87-da07-475a-b94a-8a473a64be9b-webhook-cert\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.751974 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5afee87-da07-475a-b94a-8a473a64be9b-apiservice-cert\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.752012 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnmkk\" (UniqueName: \"kubernetes.io/projected/e5afee87-da07-475a-b94a-8a473a64be9b-kube-api-access-xnmkk\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.854897 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5afee87-da07-475a-b94a-8a473a64be9b-apiservice-cert\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.855436 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnmkk\" (UniqueName: \"kubernetes.io/projected/e5afee87-da07-475a-b94a-8a473a64be9b-kube-api-access-xnmkk\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.855488 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5afee87-da07-475a-b94a-8a473a64be9b-webhook-cert\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.861512 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e5afee87-da07-475a-b94a-8a473a64be9b-webhook-cert\") pod 
\"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.863789 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e5afee87-da07-475a-b94a-8a473a64be9b-apiservice-cert\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.886340 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnmkk\" (UniqueName: \"kubernetes.io/projected/e5afee87-da07-475a-b94a-8a473a64be9b-kube-api-access-xnmkk\") pod \"metallb-operator-webhook-server-7ffb9d7cb9-tz7qq\" (UID: \"e5afee87-da07-475a-b94a-8a473a64be9b\") " pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.963938 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs"] Oct 08 22:36:58 crc kubenswrapper[4834]: I1008 22:36:58.981231 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:36:59 crc kubenswrapper[4834]: I1008 22:36:59.409021 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq"] Oct 08 22:36:59 crc kubenswrapper[4834]: W1008 22:36:59.416879 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5afee87_da07_475a_b94a_8a473a64be9b.slice/crio-f0ce3d11242c0cdcdc39e588a1ae73534d631528466a46527cb0f773ce361a39 WatchSource:0}: Error finding container f0ce3d11242c0cdcdc39e588a1ae73534d631528466a46527cb0f773ce361a39: Status 404 returned error can't find the container with id f0ce3d11242c0cdcdc39e588a1ae73534d631528466a46527cb0f773ce361a39 Oct 08 22:36:59 crc kubenswrapper[4834]: I1008 22:36:59.563631 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8e1e17-2b8b-4b54-87a9-acaf42412344" path="/var/lib/kubelet/pods/8c8e1e17-2b8b-4b54-87a9-acaf42412344/volumes" Oct 08 22:36:59 crc kubenswrapper[4834]: I1008 22:36:59.567333 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" event={"ID":"3ad77aef-6ec4-4902-b24e-64599745e983","Type":"ContainerStarted","Data":"e86c297db59774ffebe327e6806707ee44d0507cac2d41ed1589740e1844adfe"} Oct 08 22:36:59 crc kubenswrapper[4834]: I1008 22:36:59.568508 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" event={"ID":"e5afee87-da07-475a-b94a-8a473a64be9b","Type":"ContainerStarted","Data":"f0ce3d11242c0cdcdc39e588a1ae73534d631528466a46527cb0f773ce361a39"} Oct 08 22:37:03 crc kubenswrapper[4834]: I1008 22:37:03.599243 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" 
event={"ID":"3ad77aef-6ec4-4902-b24e-64599745e983","Type":"ContainerStarted","Data":"720e9982d2c34db3a8022fd74c993e25c52b746e2e18073a05508c8e344baa6f"} Oct 08 22:37:03 crc kubenswrapper[4834]: I1008 22:37:03.600026 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:37:03 crc kubenswrapper[4834]: I1008 22:37:03.636106 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" podStartSLOduration=2.058530849 podStartE2EDuration="5.636084809s" podCreationTimestamp="2025-10-08 22:36:58 +0000 UTC" firstStartedPulling="2025-10-08 22:36:58.979213697 +0000 UTC m=+826.802098443" lastFinishedPulling="2025-10-08 22:37:02.556767657 +0000 UTC m=+830.379652403" observedRunningTime="2025-10-08 22:37:03.634686035 +0000 UTC m=+831.457570781" watchObservedRunningTime="2025-10-08 22:37:03.636084809 +0000 UTC m=+831.458969555" Oct 08 22:37:04 crc kubenswrapper[4834]: I1008 22:37:04.607255 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" event={"ID":"e5afee87-da07-475a-b94a-8a473a64be9b","Type":"ContainerStarted","Data":"9ce3e8025310c53f09275be54ff1f3f463a029bafa116cf16ef2efcbec7267fc"} Oct 08 22:37:04 crc kubenswrapper[4834]: I1008 22:37:04.648052 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" podStartSLOduration=1.720060079 podStartE2EDuration="6.648031149s" podCreationTimestamp="2025-10-08 22:36:58 +0000 UTC" firstStartedPulling="2025-10-08 22:36:59.420127234 +0000 UTC m=+827.243011970" lastFinishedPulling="2025-10-08 22:37:04.348098284 +0000 UTC m=+832.170983040" observedRunningTime="2025-10-08 22:37:04.643831136 +0000 UTC m=+832.466715882" watchObservedRunningTime="2025-10-08 22:37:04.648031149 +0000 UTC m=+832.470915895" Oct 
08 22:37:05 crc kubenswrapper[4834]: I1008 22:37:05.613891 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:37:18 crc kubenswrapper[4834]: I1008 22:37:18.987369 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7ffb9d7cb9-tz7qq" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.742388 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sq4wf"] Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.746064 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.766925 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sq4wf"] Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.893407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snchv\" (UniqueName: \"kubernetes.io/projected/919ca273-8930-4c3c-9e9a-4188d217ec74-kube-api-access-snchv\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.893728 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-catalog-content\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.893862 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-utilities\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.995666 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-catalog-content\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.995729 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-utilities\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.995776 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snchv\" (UniqueName: \"kubernetes.io/projected/919ca273-8930-4c3c-9e9a-4188d217ec74-kube-api-access-snchv\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.996501 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-catalog-content\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:27 crc kubenswrapper[4834]: I1008 22:37:27.996517 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-utilities\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:28 crc kubenswrapper[4834]: I1008 22:37:28.041356 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snchv\" (UniqueName: \"kubernetes.io/projected/919ca273-8930-4c3c-9e9a-4188d217ec74-kube-api-access-snchv\") pod \"certified-operators-sq4wf\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") " pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:28 crc kubenswrapper[4834]: I1008 22:37:28.082689 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:28 crc kubenswrapper[4834]: I1008 22:37:28.606533 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sq4wf"] Oct 08 22:37:28 crc kubenswrapper[4834]: I1008 22:37:28.785277 4834 generic.go:334] "Generic (PLEG): container finished" podID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerID="26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3" exitCode=0 Oct 08 22:37:28 crc kubenswrapper[4834]: I1008 22:37:28.785329 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq4wf" event={"ID":"919ca273-8930-4c3c-9e9a-4188d217ec74","Type":"ContainerDied","Data":"26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3"} Oct 08 22:37:28 crc kubenswrapper[4834]: I1008 22:37:28.785365 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq4wf" event={"ID":"919ca273-8930-4c3c-9e9a-4188d217ec74","Type":"ContainerStarted","Data":"628c075af07866bd4dbfce8c48d2dffa41c3a4c7bde43563812307bac880fa63"} Oct 08 22:37:29 crc kubenswrapper[4834]: I1008 22:37:29.797336 4834 generic.go:334] "Generic (PLEG): container 
finished" podID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerID="d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d" exitCode=0 Oct 08 22:37:29 crc kubenswrapper[4834]: I1008 22:37:29.797464 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq4wf" event={"ID":"919ca273-8930-4c3c-9e9a-4188d217ec74","Type":"ContainerDied","Data":"d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d"} Oct 08 22:37:30 crc kubenswrapper[4834]: I1008 22:37:30.809012 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq4wf" event={"ID":"919ca273-8930-4c3c-9e9a-4188d217ec74","Type":"ContainerStarted","Data":"ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208"} Oct 08 22:37:30 crc kubenswrapper[4834]: I1008 22:37:30.835214 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sq4wf" podStartSLOduration=2.267570126 podStartE2EDuration="3.835197597s" podCreationTimestamp="2025-10-08 22:37:27 +0000 UTC" firstStartedPulling="2025-10-08 22:37:28.787107462 +0000 UTC m=+856.609992208" lastFinishedPulling="2025-10-08 22:37:30.354734933 +0000 UTC m=+858.177619679" observedRunningTime="2025-10-08 22:37:30.833939726 +0000 UTC m=+858.656824472" watchObservedRunningTime="2025-10-08 22:37:30.835197597 +0000 UTC m=+858.658082343" Oct 08 22:37:38 crc kubenswrapper[4834]: I1008 22:37:38.083681 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:38 crc kubenswrapper[4834]: I1008 22:37:38.084532 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:38 crc kubenswrapper[4834]: I1008 22:37:38.135451 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 
22:37:38 crc kubenswrapper[4834]: I1008 22:37:38.700860 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64c74dd74f-st2xs" Oct 08 22:37:38 crc kubenswrapper[4834]: I1008 22:37:38.949203 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sq4wf" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.478773 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx"] Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.479635 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.482809 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xcj5n" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.485406 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.492009 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7mpst"] Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.495169 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.496297 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx"] Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.498459 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.504074 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567117 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzqz\" (UniqueName: \"kubernetes.io/projected/ca858c41-1390-4fcb-84a9-07b0482b6996-kube-api-access-rxzqz\") pod \"frr-k8s-webhook-server-64bf5d555-czqvx\" (UID: \"ca858c41-1390-4fcb-84a9-07b0482b6996\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567196 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-startup\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567214 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-conf\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567238 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics-certs\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567269 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-sockets\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca858c41-1390-4fcb-84a9-07b0482b6996-cert\") pod \"frr-k8s-webhook-server-64bf5d555-czqvx\" (UID: \"ca858c41-1390-4fcb-84a9-07b0482b6996\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567299 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567350 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-reloader\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.567365 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9qtg\" (UniqueName: 
\"kubernetes.io/projected/e5e9deb8-fb15-4cfb-8104-52006098ee11-kube-api-access-c9qtg\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.580234 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-n8v9d"] Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.581184 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.583945 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.584689 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.585900 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.586434 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kjxlr" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.600102 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-g8nxg"] Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.601326 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.603761 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.623313 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-g8nxg"] Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668367 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-startup\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668427 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-cert\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668455 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-conf\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668486 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-metrics-certs\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668516 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics-certs\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668587 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-sockets\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668615 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lplzc\" (UniqueName: \"kubernetes.io/projected/c01f98dd-783e-431d-a695-053b316e9c60-kube-api-access-lplzc\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668641 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca858c41-1390-4fcb-84a9-07b0482b6996-cert\") pod \"frr-k8s-webhook-server-64bf5d555-czqvx\" (UID: \"ca858c41-1390-4fcb-84a9-07b0482b6996\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668664 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668694 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7rcm\" (UniqueName: 
\"kubernetes.io/projected/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-kube-api-access-n7rcm\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668926 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-reloader\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.668987 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9qtg\" (UniqueName: \"kubernetes.io/projected/e5e9deb8-fb15-4cfb-8104-52006098ee11-kube-api-access-c9qtg\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.669018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzqz\" (UniqueName: \"kubernetes.io/projected/ca858c41-1390-4fcb-84a9-07b0482b6996-kube-api-access-rxzqz\") pod \"frr-k8s-webhook-server-64bf5d555-czqvx\" (UID: \"ca858c41-1390-4fcb-84a9-07b0482b6996\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.669044 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-metrics-certs\") pod 
\"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.669071 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-metallb-excludel2\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: E1008 22:37:39.669928 4834 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 08 22:37:39 crc kubenswrapper[4834]: E1008 22:37:39.670038 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics-certs podName:e5e9deb8-fb15-4cfb-8104-52006098ee11 nodeName:}" failed. No retries permitted until 2025-10-08 22:37:40.17001354 +0000 UTC m=+867.992898296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics-certs") pod "frr-k8s-7mpst" (UID: "e5e9deb8-fb15-4cfb-8104-52006098ee11") : secret "frr-k8s-certs-secret" not found Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.670047 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-conf\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.670458 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-startup\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.671033 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-reloader\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.671329 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-frr-sockets\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.671521 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 
22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.678747 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca858c41-1390-4fcb-84a9-07b0482b6996-cert\") pod \"frr-k8s-webhook-server-64bf5d555-czqvx\" (UID: \"ca858c41-1390-4fcb-84a9-07b0482b6996\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.690610 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzqz\" (UniqueName: \"kubernetes.io/projected/ca858c41-1390-4fcb-84a9-07b0482b6996-kube-api-access-rxzqz\") pod \"frr-k8s-webhook-server-64bf5d555-czqvx\" (UID: \"ca858c41-1390-4fcb-84a9-07b0482b6996\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.692196 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9qtg\" (UniqueName: \"kubernetes.io/projected/e5e9deb8-fb15-4cfb-8104-52006098ee11-kube-api-access-c9qtg\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770257 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-metrics-certs\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770343 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-metallb-excludel2\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770407 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-cert\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-metrics-certs\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d" Oct 08 22:37:39 crc kubenswrapper[4834]: E1008 22:37:39.770501 4834 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770543 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lplzc\" (UniqueName: \"kubernetes.io/projected/c01f98dd-783e-431d-a695-053b316e9c60-kube-api-access-lplzc\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:37:39 crc kubenswrapper[4834]: E1008 22:37:39.770593 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-metrics-certs podName:c01f98dd-783e-431d-a695-053b316e9c60 nodeName:}" failed. No retries permitted until 2025-10-08 22:37:40.270571988 +0000 UTC m=+868.093456724 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-metrics-certs") pod "controller-68d546b9d8-g8nxg" (UID: "c01f98dd-783e-431d-a695-053b316e9c60") : secret "controller-certs-secret" not found
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7rcm\" (UniqueName: \"kubernetes.io/projected/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-kube-api-access-n7rcm\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.770718 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:39 crc kubenswrapper[4834]: E1008 22:37:39.770837 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 08 22:37:39 crc kubenswrapper[4834]: E1008 22:37:39.770863 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist podName:53b63e43-0ee5-4d6c-b029-5d24b2d5aa96 nodeName:}" failed. No retries permitted until 2025-10-08 22:37:40.270856405 +0000 UTC m=+868.093741151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist") pod "speaker-n8v9d" (UID: "53b63e43-0ee5-4d6c-b029-5d24b2d5aa96") : secret "metallb-memberlist" not found
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.771971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-metallb-excludel2\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.773127 4834 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.776521 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-metrics-certs\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.788395 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-cert\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.788441 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lplzc\" (UniqueName: \"kubernetes.io/projected/c01f98dd-783e-431d-a695-053b316e9c60-kube-api-access-lplzc\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.792136 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7rcm\" (UniqueName: \"kubernetes.io/projected/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-kube-api-access-n7rcm\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:39 crc kubenswrapper[4834]: I1008 22:37:39.806669 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.176993 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics-certs\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.187133 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5e9deb8-fb15-4cfb-8104-52006098ee11-metrics-certs\") pod \"frr-k8s-7mpst\" (UID: \"e5e9deb8-fb15-4cfb-8104-52006098ee11\") " pod="metallb-system/frr-k8s-7mpst"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.246749 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx"]
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.279581 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.279709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-metrics-certs\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg"
Oct 08 22:37:40 crc kubenswrapper[4834]: E1008 22:37:40.279861 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 08 22:37:40 crc kubenswrapper[4834]: E1008 22:37:40.280016 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist podName:53b63e43-0ee5-4d6c-b029-5d24b2d5aa96 nodeName:}" failed. No retries permitted until 2025-10-08 22:37:41.279977363 +0000 UTC m=+869.102862189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist") pod "speaker-n8v9d" (UID: "53b63e43-0ee5-4d6c-b029-5d24b2d5aa96") : secret "metallb-memberlist" not found
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.285006 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c01f98dd-783e-431d-a695-053b316e9c60-metrics-certs\") pod \"controller-68d546b9d8-g8nxg\" (UID: \"c01f98dd-783e-431d-a695-053b316e9c60\") " pod="metallb-system/controller-68d546b9d8-g8nxg"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.418701 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7mpst"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.524745 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-g8nxg"
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.530500 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq4wf"]
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.902325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" event={"ID":"ca858c41-1390-4fcb-84a9-07b0482b6996","Type":"ContainerStarted","Data":"1a0a2dfe99906a14c4a009d44789e53b6e25a2f8c128aee525d673b331abc4a9"}
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.906138 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"3a5e36f1c4a6bd43ab84c518bcfd2d66e3c282ccf6a869545c73659fb31bdbcd"}
Oct 08 22:37:40 crc kubenswrapper[4834]: I1008 22:37:40.906477 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sq4wf" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="registry-server" containerID="cri-o://ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208" gracePeriod=2
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.028432 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-g8nxg"]
Oct 08 22:37:41 crc kubenswrapper[4834]: W1008 22:37:41.049884 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc01f98dd_783e_431d_a695_053b316e9c60.slice/crio-cc287d58a8126be1b1bfd9d917feface23e2810e6e4cd963e5c2de3195820059 WatchSource:0}: Error finding container cc287d58a8126be1b1bfd9d917feface23e2810e6e4cd963e5c2de3195820059: Status 404 returned error can't find the container with id cc287d58a8126be1b1bfd9d917feface23e2810e6e4cd963e5c2de3195820059
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.298132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:41 crc kubenswrapper[4834]: E1008 22:37:41.298515 4834 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Oct 08 22:37:41 crc kubenswrapper[4834]: E1008 22:37:41.298738 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist podName:53b63e43-0ee5-4d6c-b029-5d24b2d5aa96 nodeName:}" failed. No retries permitted until 2025-10-08 22:37:43.298711231 +0000 UTC m=+871.121595997 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist") pod "speaker-n8v9d" (UID: "53b63e43-0ee5-4d6c-b029-5d24b2d5aa96") : secret "metallb-memberlist" not found
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.375668 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq4wf"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.501043 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snchv\" (UniqueName: \"kubernetes.io/projected/919ca273-8930-4c3c-9e9a-4188d217ec74-kube-api-access-snchv\") pod \"919ca273-8930-4c3c-9e9a-4188d217ec74\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") "
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.501180 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-utilities\") pod \"919ca273-8930-4c3c-9e9a-4188d217ec74\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") "
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.501235 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-catalog-content\") pod \"919ca273-8930-4c3c-9e9a-4188d217ec74\" (UID: \"919ca273-8930-4c3c-9e9a-4188d217ec74\") "
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.502608 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-utilities" (OuterVolumeSpecName: "utilities") pod "919ca273-8930-4c3c-9e9a-4188d217ec74" (UID: "919ca273-8930-4c3c-9e9a-4188d217ec74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.508492 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919ca273-8930-4c3c-9e9a-4188d217ec74-kube-api-access-snchv" (OuterVolumeSpecName: "kube-api-access-snchv") pod "919ca273-8930-4c3c-9e9a-4188d217ec74" (UID: "919ca273-8930-4c3c-9e9a-4188d217ec74"). InnerVolumeSpecName "kube-api-access-snchv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.549976 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "919ca273-8930-4c3c-9e9a-4188d217ec74" (UID: "919ca273-8930-4c3c-9e9a-4188d217ec74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.602891 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snchv\" (UniqueName: \"kubernetes.io/projected/919ca273-8930-4c3c-9e9a-4188d217ec74-kube-api-access-snchv\") on node \"crc\" DevicePath \"\""
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.602955 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.602970 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/919ca273-8930-4c3c-9e9a-4188d217ec74-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.915350 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-g8nxg" event={"ID":"c01f98dd-783e-431d-a695-053b316e9c60","Type":"ContainerStarted","Data":"a08fbcadc01105ace0f3303c685f27c696ec04c92ace28a74034bd7acedaf27a"}
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.915408 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-g8nxg" event={"ID":"c01f98dd-783e-431d-a695-053b316e9c60","Type":"ContainerStarted","Data":"e33ec035270e4ae7708c48bf0e869df43318bffe8f01518392532cca81ad3f75"}
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.915419 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-g8nxg" event={"ID":"c01f98dd-783e-431d-a695-053b316e9c60","Type":"ContainerStarted","Data":"cc287d58a8126be1b1bfd9d917feface23e2810e6e4cd963e5c2de3195820059"}
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.915492 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-g8nxg"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.918555 4834 generic.go:334] "Generic (PLEG): container finished" podID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerID="ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208" exitCode=0
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.918598 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq4wf" event={"ID":"919ca273-8930-4c3c-9e9a-4188d217ec74","Type":"ContainerDied","Data":"ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208"}
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.918625 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq4wf" event={"ID":"919ca273-8930-4c3c-9e9a-4188d217ec74","Type":"ContainerDied","Data":"628c075af07866bd4dbfce8c48d2dffa41c3a4c7bde43563812307bac880fa63"}
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.918645 4834 scope.go:117] "RemoveContainer" containerID="ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.918774 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq4wf"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.937909 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-g8nxg" podStartSLOduration=2.9378886509999997 podStartE2EDuration="2.937888651s" podCreationTimestamp="2025-10-08 22:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:37:41.934281013 +0000 UTC m=+869.757165759" watchObservedRunningTime="2025-10-08 22:37:41.937888651 +0000 UTC m=+869.760773397"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.944087 4834 scope.go:117] "RemoveContainer" containerID="d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.949524 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq4wf"]
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.951366 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sq4wf"]
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.977163 4834 scope.go:117] "RemoveContainer" containerID="26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.997010 4834 scope.go:117] "RemoveContainer" containerID="ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208"
Oct 08 22:37:41 crc kubenswrapper[4834]: E1008 22:37:41.998086 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208\": container with ID starting with ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208 not found: ID does not exist" containerID="ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.998118 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208"} err="failed to get container status \"ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208\": rpc error: code = NotFound desc = could not find container \"ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208\": container with ID starting with ff7afe109e3bf3d416b1b1957bf419ba4ad42191dc32f71e6eaf87b562a53208 not found: ID does not exist"
Oct 08 22:37:41 crc kubenswrapper[4834]: I1008 22:37:41.998183 4834 scope.go:117] "RemoveContainer" containerID="d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d"
Oct 08 22:37:42 crc kubenswrapper[4834]: E1008 22:37:42.000262 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d\": container with ID starting with d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d not found: ID does not exist" containerID="d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d"
Oct 08 22:37:42 crc kubenswrapper[4834]: I1008 22:37:42.000314 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d"} err="failed to get container status \"d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d\": rpc error: code = NotFound desc = could not find container \"d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d\": container with ID starting with d81439846290b3a92f2405a2576d5ab16efd2c1af870143e20b9d5f0129cdb4d not found: ID does not exist"
Oct 08 22:37:42 crc kubenswrapper[4834]: I1008 22:37:42.000346 4834 scope.go:117] "RemoveContainer" containerID="26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3"
Oct 08 22:37:42 crc kubenswrapper[4834]: E1008 22:37:42.002476 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3\": container with ID starting with 26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3 not found: ID does not exist" containerID="26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3"
Oct 08 22:37:42 crc kubenswrapper[4834]: I1008 22:37:42.002530 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3"} err="failed to get container status \"26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3\": rpc error: code = NotFound desc = could not find container \"26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3\": container with ID starting with 26400407551dd3692f8d73e408e4e5eec1c10d42ea83089dd2618adc5ec120f3 not found: ID does not exist"
Oct 08 22:37:43 crc kubenswrapper[4834]: I1008 22:37:43.329506 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:43 crc kubenswrapper[4834]: I1008 22:37:43.333669 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/53b63e43-0ee5-4d6c-b029-5d24b2d5aa96-memberlist\") pod \"speaker-n8v9d\" (UID: \"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96\") " pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:43 crc kubenswrapper[4834]: I1008 22:37:43.495331 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:43 crc kubenswrapper[4834]: I1008 22:37:43.563470 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" path="/var/lib/kubelet/pods/919ca273-8930-4c3c-9e9a-4188d217ec74/volumes"
Oct 08 22:37:43 crc kubenswrapper[4834]: I1008 22:37:43.935500 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n8v9d" event={"ID":"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96","Type":"ContainerStarted","Data":"4b9dbe8a392ef752e41c18836dd22ca34ef83ce3297c4148bb1df55647a6dab5"}
Oct 08 22:37:44 crc kubenswrapper[4834]: I1008 22:37:44.956925 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n8v9d" event={"ID":"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96","Type":"ContainerStarted","Data":"b026d6df62f6e2dadd365132abdfc3b45443d111078e2e4cfd1ce4669776541a"}
Oct 08 22:37:44 crc kubenswrapper[4834]: I1008 22:37:44.956989 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-n8v9d" event={"ID":"53b63e43-0ee5-4d6c-b029-5d24b2d5aa96","Type":"ContainerStarted","Data":"0d49fec546aa565d39e62cdbea3b3172e52eb7debee4ef643d6767b1faad611a"}
Oct 08 22:37:44 crc kubenswrapper[4834]: I1008 22:37:44.957167 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:44 crc kubenswrapper[4834]: I1008 22:37:44.989396 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-n8v9d" podStartSLOduration=5.989369597 podStartE2EDuration="5.989369597s" podCreationTimestamp="2025-10-08 22:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:37:44.987533212 +0000 UTC m=+872.810417958" watchObservedRunningTime="2025-10-08 22:37:44.989369597 +0000 UTC m=+872.812254383"
Oct 08 22:37:48 crc kubenswrapper[4834]: I1008 22:37:48.993724 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" event={"ID":"ca858c41-1390-4fcb-84a9-07b0482b6996","Type":"ContainerStarted","Data":"50615de86e3bdd81ff4564870942efea00da58aaaf2b7e35609e6d29809549fe"}
Oct 08 22:37:48 crc kubenswrapper[4834]: I1008 22:37:48.994775 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx"
Oct 08 22:37:48 crc kubenswrapper[4834]: I1008 22:37:48.996533 4834 generic.go:334] "Generic (PLEG): container finished" podID="e5e9deb8-fb15-4cfb-8104-52006098ee11" containerID="95844f6898983006a1f2b6ff60298a960fc31439aba411d760247917cb5a0181" exitCode=0
Oct 08 22:37:48 crc kubenswrapper[4834]: I1008 22:37:48.996605 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerDied","Data":"95844f6898983006a1f2b6ff60298a960fc31439aba411d760247917cb5a0181"}
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.032258 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" podStartSLOduration=2.445580796 podStartE2EDuration="10.032222129s" podCreationTimestamp="2025-10-08 22:37:39 +0000 UTC" firstStartedPulling="2025-10-08 22:37:40.265922268 +0000 UTC m=+868.088807054" lastFinishedPulling="2025-10-08 22:37:47.852563601 +0000 UTC m=+875.675448387" observedRunningTime="2025-10-08 22:37:49.017381845 +0000 UTC m=+876.840266601" watchObservedRunningTime="2025-10-08 22:37:49.032222129 +0000 UTC m=+876.855106905"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.142725 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-89b85"]
Oct 08 22:37:49 crc kubenswrapper[4834]: E1008 22:37:49.143087 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="extract-utilities"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.143116 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="extract-utilities"
Oct 08 22:37:49 crc kubenswrapper[4834]: E1008 22:37:49.143139 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="registry-server"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.143185 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="registry-server"
Oct 08 22:37:49 crc kubenswrapper[4834]: E1008 22:37:49.143221 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="extract-content"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.143236 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="extract-content"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.143450 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="919ca273-8930-4c3c-9e9a-4188d217ec74" containerName="registry-server"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.145087 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.161839 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89b85"]
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.225463 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-catalog-content\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.225872 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcln\" (UniqueName: \"kubernetes.io/projected/503ae774-f984-43c7-ae8d-75c5afe127f9-kube-api-access-wkcln\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.226008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-utilities\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.327927 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcln\" (UniqueName: \"kubernetes.io/projected/503ae774-f984-43c7-ae8d-75c5afe127f9-kube-api-access-wkcln\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.328012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-utilities\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.328059 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-catalog-content\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.328831 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-catalog-content\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.329112 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-utilities\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.354279 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcln\" (UniqueName: \"kubernetes.io/projected/503ae774-f984-43c7-ae8d-75c5afe127f9-kube-api-access-wkcln\") pod \"redhat-marketplace-89b85\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.464544 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89b85"
Oct 08 22:37:49 crc kubenswrapper[4834]: I1008 22:37:49.734479 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89b85"]
Oct 08 22:37:50 crc kubenswrapper[4834]: I1008 22:37:50.006627 4834 generic.go:334] "Generic (PLEG): container finished" podID="e5e9deb8-fb15-4cfb-8104-52006098ee11" containerID="01609db79f87147331ed4d768ca68224e1f4efb530c55e37f60d3d2c1d27c878" exitCode=0
Oct 08 22:37:50 crc kubenswrapper[4834]: I1008 22:37:50.006679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerDied","Data":"01609db79f87147331ed4d768ca68224e1f4efb530c55e37f60d3d2c1d27c878"}
Oct 08 22:37:50 crc kubenswrapper[4834]: I1008 22:37:50.008755 4834 generic.go:334] "Generic (PLEG): container finished" podID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerID="07adf66fd2a156e47764a13c77a4131fb988938409332d2914093e3e03a1e1e4" exitCode=0
Oct 08 22:37:50 crc kubenswrapper[4834]: I1008 22:37:50.008798 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89b85" event={"ID":"503ae774-f984-43c7-ae8d-75c5afe127f9","Type":"ContainerDied","Data":"07adf66fd2a156e47764a13c77a4131fb988938409332d2914093e3e03a1e1e4"}
Oct 08 22:37:50 crc kubenswrapper[4834]: I1008 22:37:50.008853 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89b85" event={"ID":"503ae774-f984-43c7-ae8d-75c5afe127f9","Type":"ContainerStarted","Data":"a6702befd5950c17f819eec0c6b4714ed18e315e41f5317d532e0680a8f004d3"}
Oct 08 22:37:51 crc kubenswrapper[4834]: I1008 22:37:51.017745 4834 generic.go:334] "Generic (PLEG): container finished" podID="e5e9deb8-fb15-4cfb-8104-52006098ee11" containerID="75ecbebf5a87821d80fac2470dce3346406333ee60551fd1ac2e74a8cb492746" exitCode=0
Oct 08 22:37:51 crc kubenswrapper[4834]: I1008 22:37:51.017876 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerDied","Data":"75ecbebf5a87821d80fac2470dce3346406333ee60551fd1ac2e74a8cb492746"}
Oct 08 22:37:52 crc kubenswrapper[4834]: I1008 22:37:52.033633 4834 generic.go:334] "Generic (PLEG): container finished" podID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerID="494d98b0fb2c50e20e04c11398628a65454c699bf356f0cb1d6e24ca1af36755" exitCode=0
Oct 08 22:37:52 crc kubenswrapper[4834]: I1008 22:37:52.033727 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89b85" event={"ID":"503ae774-f984-43c7-ae8d-75c5afe127f9","Type":"ContainerDied","Data":"494d98b0fb2c50e20e04c11398628a65454c699bf356f0cb1d6e24ca1af36755"}
Oct 08 22:37:52 crc kubenswrapper[4834]: I1008 22:37:52.049689 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"e3c793d040bf42790ccef9f0abcead119655d79733d4523480a74af8b5ca72a8"}
Oct 08 22:37:52 crc kubenswrapper[4834]: I1008 22:37:52.049915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"6250ac8bb680ae46648f7abd515848532855ea2a90187afbb8647015595eb0c1"}
Oct 08 22:37:52 crc kubenswrapper[4834]: I1008 22:37:52.049937 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"3e18a136e790bcb7e5f1bcde33a48a2c1756b0098d703cb28371e8bf194bb0a3"}
Oct 08 22:37:52 crc kubenswrapper[4834]: I1008 22:37:52.049954 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"1e8147e55d3c93a1b0bc4bcdeb2e13f4833ce1e5b76201baf86bb012a023a3cd"}
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.087388 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"9bbd2b580dae9452bf4f757711341358c27f2353de1317214127493cd3fe0c57"}
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.087437 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7mpst" event={"ID":"e5e9deb8-fb15-4cfb-8104-52006098ee11","Type":"ContainerStarted","Data":"8b91c717c51f138812709270d34e4b75d0cd86132ad9da001ebba4bc72b0a348"}
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.087998 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7mpst"
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.089477 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89b85" event={"ID":"503ae774-f984-43c7-ae8d-75c5afe127f9","Type":"ContainerStarted","Data":"e0fc03dbefbaa736751a7bd440601f9ba8328d3abe112c7fb73b525609b5c9d4"}
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.146265 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-89b85" podStartSLOduration=1.6074604369999999 podStartE2EDuration="4.146248338s" podCreationTimestamp="2025-10-08 22:37:49 +0000 UTC" firstStartedPulling="2025-10-08 22:37:50.010039892 +0000 UTC m=+877.832924638" lastFinishedPulling="2025-10-08 22:37:52.548827763 +0000 UTC m=+880.371712539" observedRunningTime="2025-10-08 22:37:53.145019248 +0000 UTC m=+880.967904004" watchObservedRunningTime="2025-10-08 22:37:53.146248338 +0000 UTC m=+880.969133084"
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.155418 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7mpst" podStartSLOduration=6.926162254 podStartE2EDuration="14.155389993s" podCreationTimestamp="2025-10-08 22:37:39 +0000 UTC" firstStartedPulling="2025-10-08 22:37:40.650660573 +0000 UTC m=+868.473545319" lastFinishedPulling="2025-10-08 22:37:47.879888302 +0000 UTC m=+875.702773058" observedRunningTime="2025-10-08 22:37:53.126397931 +0000 UTC m=+880.949282697" watchObservedRunningTime="2025-10-08 22:37:53.155389993 +0000 UTC m=+880.978274739"
Oct 08 22:37:53 crc kubenswrapper[4834]: I1008 22:37:53.501320 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-n8v9d"
Oct 08 22:37:54 crc kubenswrapper[4834]: I1008 22:37:54.997718 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"]
Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.000006 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"
Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.004824 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.018022 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"]
Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.121252 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"
Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.121298 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"
Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.121364 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr6lk\" (UniqueName: \"kubernetes.io/projected/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-kube-api-access-gr6lk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"
Oct 08 22:37:55 crc kubenswrapper[4834]:
I1008 22:37:55.223018 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.223073 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.223159 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr6lk\" (UniqueName: \"kubernetes.io/projected/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-kube-api-access-gr6lk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.223605 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.223631 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.241191 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr6lk\" (UniqueName: \"kubernetes.io/projected/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-kube-api-access-gr6lk\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.315368 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.419466 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.461327 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7mpst" Oct 08 22:37:55 crc kubenswrapper[4834]: I1008 22:37:55.729185 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r"] Oct 08 22:37:55 crc kubenswrapper[4834]: W1008 22:37:55.740086 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef56eccf_0b07_4f8d_b7fd_6f9c025fc856.slice/crio-4ba26ea32050fb2a0bcd13fc91482ab0d01bb740fa551363c1a34613b41af369 WatchSource:0}: Error finding container 4ba26ea32050fb2a0bcd13fc91482ab0d01bb740fa551363c1a34613b41af369: Status 404 returned error can't 
find the container with id 4ba26ea32050fb2a0bcd13fc91482ab0d01bb740fa551363c1a34613b41af369 Oct 08 22:37:56 crc kubenswrapper[4834]: I1008 22:37:56.115567 4834 generic.go:334] "Generic (PLEG): container finished" podID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerID="4bc96462376c7a883818654b86912d1ba2cbcc984a18c822661ca74ce6e9a844" exitCode=0 Oct 08 22:37:56 crc kubenswrapper[4834]: I1008 22:37:56.115699 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" event={"ID":"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856","Type":"ContainerDied","Data":"4bc96462376c7a883818654b86912d1ba2cbcc984a18c822661ca74ce6e9a844"} Oct 08 22:37:56 crc kubenswrapper[4834]: I1008 22:37:56.115922 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" event={"ID":"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856","Type":"ContainerStarted","Data":"4ba26ea32050fb2a0bcd13fc91482ab0d01bb740fa551363c1a34613b41af369"} Oct 08 22:37:59 crc kubenswrapper[4834]: I1008 22:37:59.465249 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-89b85" Oct 08 22:37:59 crc kubenswrapper[4834]: I1008 22:37:59.466250 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-89b85" Oct 08 22:37:59 crc kubenswrapper[4834]: I1008 22:37:59.539310 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-89b85" Oct 08 22:37:59 crc kubenswrapper[4834]: I1008 22:37:59.815791 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-czqvx" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.151682 4834 generic.go:334] "Generic (PLEG): container finished" podID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" 
containerID="e7493e5177bcb9e3eae69b65955fbc460a08a1fc2f1cfcedbc797da4cfbd8730" exitCode=0 Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.151754 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" event={"ID":"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856","Type":"ContainerDied","Data":"e7493e5177bcb9e3eae69b65955fbc460a08a1fc2f1cfcedbc797da4cfbd8730"} Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.199297 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-89b85" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.342211 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xpl7m"] Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.344218 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.362972 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xpl7m"] Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.425970 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-catalog-content\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.426176 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4xq\" (UniqueName: \"kubernetes.io/projected/c1bb28c5-e0f9-407e-ae86-6ec23348a786-kube-api-access-bv4xq\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " 
pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.426228 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-utilities\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.528330 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv4xq\" (UniqueName: \"kubernetes.io/projected/c1bb28c5-e0f9-407e-ae86-6ec23348a786-kube-api-access-bv4xq\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.528425 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-utilities\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.528500 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-catalog-content\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.529327 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-catalog-content\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " 
pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.529379 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-utilities\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.532934 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-g8nxg" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.572251 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv4xq\" (UniqueName: \"kubernetes.io/projected/c1bb28c5-e0f9-407e-ae86-6ec23348a786-kube-api-access-bv4xq\") pod \"community-operators-xpl7m\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.675266 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:00 crc kubenswrapper[4834]: I1008 22:38:00.987181 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xpl7m"] Oct 08 22:38:01 crc kubenswrapper[4834]: I1008 22:38:01.161407 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerStarted","Data":"9b9e2b8352fa0b6a7a7521ff7e4fc8c8c632f5baa84f0b720d9491923a0b922c"} Oct 08 22:38:01 crc kubenswrapper[4834]: I1008 22:38:01.163488 4834 generic.go:334] "Generic (PLEG): container finished" podID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerID="27789ca46d113630389598727afbeaee5c72ad76ace17f83dafc1b4f44328f18" exitCode=0 Oct 08 22:38:01 crc kubenswrapper[4834]: I1008 22:38:01.163590 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" event={"ID":"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856","Type":"ContainerDied","Data":"27789ca46d113630389598727afbeaee5c72ad76ace17f83dafc1b4f44328f18"} Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.175367 4834 generic.go:334] "Generic (PLEG): container finished" podID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerID="597e94ab7461bc9f0f126348eefda40cbd8769f595c13a76adccb97892df379c" exitCode=0 Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.175463 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerDied","Data":"597e94ab7461bc9f0f126348eefda40cbd8769f595c13a76adccb97892df379c"} Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.582857 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.667640 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-util\") pod \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.667789 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-bundle\") pod \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.668014 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr6lk\" (UniqueName: \"kubernetes.io/projected/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-kube-api-access-gr6lk\") pod \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\" (UID: \"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856\") " Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.669714 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-bundle" (OuterVolumeSpecName: "bundle") pod "ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" (UID: "ef56eccf-0b07-4f8d-b7fd-6f9c025fc856"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.676281 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-kube-api-access-gr6lk" (OuterVolumeSpecName: "kube-api-access-gr6lk") pod "ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" (UID: "ef56eccf-0b07-4f8d-b7fd-6f9c025fc856"). InnerVolumeSpecName "kube-api-access-gr6lk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.683318 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-util" (OuterVolumeSpecName: "util") pod "ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" (UID: "ef56eccf-0b07-4f8d-b7fd-6f9c025fc856"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.770437 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr6lk\" (UniqueName: \"kubernetes.io/projected/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-kube-api-access-gr6lk\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.770485 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:02 crc kubenswrapper[4834]: I1008 22:38:02.770502 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef56eccf-0b07-4f8d-b7fd-6f9c025fc856-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:03 crc kubenswrapper[4834]: I1008 22:38:03.187644 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerStarted","Data":"030a256ac01707bec1974a9929a3acfa086156e0546185dfb97af1b5eac7b69e"} Oct 08 22:38:03 crc kubenswrapper[4834]: I1008 22:38:03.190403 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" event={"ID":"ef56eccf-0b07-4f8d-b7fd-6f9c025fc856","Type":"ContainerDied","Data":"4ba26ea32050fb2a0bcd13fc91482ab0d01bb740fa551363c1a34613b41af369"} Oct 08 22:38:03 crc kubenswrapper[4834]: I1008 22:38:03.190463 
4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ba26ea32050fb2a0bcd13fc91482ab0d01bb740fa551363c1a34613b41af369" Oct 08 22:38:03 crc kubenswrapper[4834]: I1008 22:38:03.190502 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r" Oct 08 22:38:03 crc kubenswrapper[4834]: I1008 22:38:03.927138 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89b85"] Oct 08 22:38:03 crc kubenswrapper[4834]: I1008 22:38:03.927508 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-89b85" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="registry-server" containerID="cri-o://e0fc03dbefbaa736751a7bd440601f9ba8328d3abe112c7fb73b525609b5c9d4" gracePeriod=2 Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.207302 4834 generic.go:334] "Generic (PLEG): container finished" podID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerID="030a256ac01707bec1974a9929a3acfa086156e0546185dfb97af1b5eac7b69e" exitCode=0 Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.207370 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerDied","Data":"030a256ac01707bec1974a9929a3acfa086156e0546185dfb97af1b5eac7b69e"} Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.219017 4834 generic.go:334] "Generic (PLEG): container finished" podID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerID="e0fc03dbefbaa736751a7bd440601f9ba8328d3abe112c7fb73b525609b5c9d4" exitCode=0 Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.219071 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89b85" 
event={"ID":"503ae774-f984-43c7-ae8d-75c5afe127f9","Type":"ContainerDied","Data":"e0fc03dbefbaa736751a7bd440601f9ba8328d3abe112c7fb73b525609b5c9d4"} Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.448119 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89b85" Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.502403 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-catalog-content\") pod \"503ae774-f984-43c7-ae8d-75c5afe127f9\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.502532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-utilities\") pod \"503ae774-f984-43c7-ae8d-75c5afe127f9\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.502580 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkcln\" (UniqueName: \"kubernetes.io/projected/503ae774-f984-43c7-ae8d-75c5afe127f9-kube-api-access-wkcln\") pod \"503ae774-f984-43c7-ae8d-75c5afe127f9\" (UID: \"503ae774-f984-43c7-ae8d-75c5afe127f9\") " Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.503547 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-utilities" (OuterVolumeSpecName: "utilities") pod "503ae774-f984-43c7-ae8d-75c5afe127f9" (UID: "503ae774-f984-43c7-ae8d-75c5afe127f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.507362 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503ae774-f984-43c7-ae8d-75c5afe127f9-kube-api-access-wkcln" (OuterVolumeSpecName: "kube-api-access-wkcln") pod "503ae774-f984-43c7-ae8d-75c5afe127f9" (UID: "503ae774-f984-43c7-ae8d-75c5afe127f9"). InnerVolumeSpecName "kube-api-access-wkcln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.527325 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "503ae774-f984-43c7-ae8d-75c5afe127f9" (UID: "503ae774-f984-43c7-ae8d-75c5afe127f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.604057 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.604352 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkcln\" (UniqueName: \"kubernetes.io/projected/503ae774-f984-43c7-ae8d-75c5afe127f9-kube-api-access-wkcln\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:04 crc kubenswrapper[4834]: I1008 22:38:04.604444 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/503ae774-f984-43c7-ae8d-75c5afe127f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.231355 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" 
event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerStarted","Data":"914a700c476a3ffc75466a2c6486584c8b3c531301396035f77ef1d54b016a3b"} Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.235362 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89b85" event={"ID":"503ae774-f984-43c7-ae8d-75c5afe127f9","Type":"ContainerDied","Data":"a6702befd5950c17f819eec0c6b4714ed18e315e41f5317d532e0680a8f004d3"} Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.235462 4834 scope.go:117] "RemoveContainer" containerID="e0fc03dbefbaa736751a7bd440601f9ba8328d3abe112c7fb73b525609b5c9d4" Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.235501 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89b85" Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.266804 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xpl7m" podStartSLOduration=2.810886091 podStartE2EDuration="5.266780467s" podCreationTimestamp="2025-10-08 22:38:00 +0000 UTC" firstStartedPulling="2025-10-08 22:38:02.178113078 +0000 UTC m=+890.000997864" lastFinishedPulling="2025-10-08 22:38:04.634007454 +0000 UTC m=+892.456892240" observedRunningTime="2025-10-08 22:38:05.259903198 +0000 UTC m=+893.082787974" watchObservedRunningTime="2025-10-08 22:38:05.266780467 +0000 UTC m=+893.089665233" Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.274543 4834 scope.go:117] "RemoveContainer" containerID="494d98b0fb2c50e20e04c11398628a65454c699bf356f0cb1d6e24ca1af36755" Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.292443 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89b85"] Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.298599 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-89b85"] Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.312480 4834 scope.go:117] "RemoveContainer" containerID="07adf66fd2a156e47764a13c77a4131fb988938409332d2914093e3e03a1e1e4" Oct 08 22:38:05 crc kubenswrapper[4834]: I1008 22:38:05.577251 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" path="/var/lib/kubelet/pods/503ae774-f984-43c7-ae8d-75c5afe127f9/volumes" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014174 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z"] Oct 08 22:38:07 crc kubenswrapper[4834]: E1008 22:38:07.014702 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="registry-server" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014720 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="registry-server" Oct 08 22:38:07 crc kubenswrapper[4834]: E1008 22:38:07.014735 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="util" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014744 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="util" Oct 08 22:38:07 crc kubenswrapper[4834]: E1008 22:38:07.014755 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="extract-utilities" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014765 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="extract-utilities" Oct 08 22:38:07 crc kubenswrapper[4834]: E1008 22:38:07.014799 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="extract" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014808 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="extract" Oct 08 22:38:07 crc kubenswrapper[4834]: E1008 22:38:07.014825 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="extract-content" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014833 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="extract-content" Oct 08 22:38:07 crc kubenswrapper[4834]: E1008 22:38:07.014845 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="pull" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014853 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="pull" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.014996 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef56eccf-0b07-4f8d-b7fd-6f9c025fc856" containerName="extract" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.015014 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="503ae774-f984-43c7-ae8d-75c5afe127f9" containerName="registry-server" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.015510 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.019295 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.019378 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.019388 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-chccm" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.037852 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44b9h\" (UniqueName: \"kubernetes.io/projected/d078c023-872f-4094-b851-26aad1cbe311-kube-api-access-44b9h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-vpr7z\" (UID: \"d078c023-872f-4094-b851-26aad1cbe311\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.040481 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z"] Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.141389 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44b9h\" (UniqueName: \"kubernetes.io/projected/d078c023-872f-4094-b851-26aad1cbe311-kube-api-access-44b9h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-vpr7z\" (UID: \"d078c023-872f-4094-b851-26aad1cbe311\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.164656 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-44b9h\" (UniqueName: \"kubernetes.io/projected/d078c023-872f-4094-b851-26aad1cbe311-kube-api-access-44b9h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-vpr7z\" (UID: \"d078c023-872f-4094-b851-26aad1cbe311\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.332671 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" Oct 08 22:38:07 crc kubenswrapper[4834]: I1008 22:38:07.767946 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z"] Oct 08 22:38:08 crc kubenswrapper[4834]: I1008 22:38:08.255963 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" event={"ID":"d078c023-872f-4094-b851-26aad1cbe311","Type":"ContainerStarted","Data":"8637882d273afe2fe283a162b66f7b8ae6bfe608b24b1649364d5fad22d6197c"} Oct 08 22:38:10 crc kubenswrapper[4834]: I1008 22:38:10.423606 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7mpst" Oct 08 22:38:10 crc kubenswrapper[4834]: I1008 22:38:10.676635 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:10 crc kubenswrapper[4834]: I1008 22:38:10.676773 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:10 crc kubenswrapper[4834]: I1008 22:38:10.745989 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:11 crc kubenswrapper[4834]: I1008 22:38:11.326446 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:13 crc kubenswrapper[4834]: I1008 22:38:13.129025 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xpl7m"] Oct 08 22:38:14 crc kubenswrapper[4834]: I1008 22:38:14.293306 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xpl7m" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="registry-server" containerID="cri-o://914a700c476a3ffc75466a2c6486584c8b3c531301396035f77ef1d54b016a3b" gracePeriod=2 Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.306000 4834 generic.go:334] "Generic (PLEG): container finished" podID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerID="914a700c476a3ffc75466a2c6486584c8b3c531301396035f77ef1d54b016a3b" exitCode=0 Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.307081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerDied","Data":"914a700c476a3ffc75466a2c6486584c8b3c531301396035f77ef1d54b016a3b"} Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.599365 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.709060 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-utilities\") pod \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.709124 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-catalog-content\") pod \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.709212 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv4xq\" (UniqueName: \"kubernetes.io/projected/c1bb28c5-e0f9-407e-ae86-6ec23348a786-kube-api-access-bv4xq\") pod \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\" (UID: \"c1bb28c5-e0f9-407e-ae86-6ec23348a786\") " Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.710320 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-utilities" (OuterVolumeSpecName: "utilities") pod "c1bb28c5-e0f9-407e-ae86-6ec23348a786" (UID: "c1bb28c5-e0f9-407e-ae86-6ec23348a786"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.710715 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.715783 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1bb28c5-e0f9-407e-ae86-6ec23348a786-kube-api-access-bv4xq" (OuterVolumeSpecName: "kube-api-access-bv4xq") pod "c1bb28c5-e0f9-407e-ae86-6ec23348a786" (UID: "c1bb28c5-e0f9-407e-ae86-6ec23348a786"). InnerVolumeSpecName "kube-api-access-bv4xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.770436 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1bb28c5-e0f9-407e-ae86-6ec23348a786" (UID: "c1bb28c5-e0f9-407e-ae86-6ec23348a786"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.812550 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv4xq\" (UniqueName: \"kubernetes.io/projected/c1bb28c5-e0f9-407e-ae86-6ec23348a786-kube-api-access-bv4xq\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:15 crc kubenswrapper[4834]: I1008 22:38:15.812608 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1bb28c5-e0f9-407e-ae86-6ec23348a786-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.316221 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" event={"ID":"d078c023-872f-4094-b851-26aad1cbe311","Type":"ContainerStarted","Data":"f08762a805dd098d1a5a2819ceaaf2a4dc8ad40321fa0545ffae58f8b25bbb42"} Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.318339 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xpl7m" event={"ID":"c1bb28c5-e0f9-407e-ae86-6ec23348a786","Type":"ContainerDied","Data":"9b9e2b8352fa0b6a7a7521ff7e4fc8c8c632f5baa84f0b720d9491923a0b922c"} Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.318373 4834 scope.go:117] "RemoveContainer" containerID="914a700c476a3ffc75466a2c6486584c8b3c531301396035f77ef1d54b016a3b" Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.318484 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xpl7m" Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.342274 4834 scope.go:117] "RemoveContainer" containerID="030a256ac01707bec1974a9929a3acfa086156e0546185dfb97af1b5eac7b69e" Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.367066 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-vpr7z" podStartSLOduration=2.728537624 podStartE2EDuration="10.36698996s" podCreationTimestamp="2025-10-08 22:38:06 +0000 UTC" firstStartedPulling="2025-10-08 22:38:07.777789946 +0000 UTC m=+895.600674732" lastFinishedPulling="2025-10-08 22:38:15.416242322 +0000 UTC m=+903.239127068" observedRunningTime="2025-10-08 22:38:16.358180974 +0000 UTC m=+904.181065750" watchObservedRunningTime="2025-10-08 22:38:16.36698996 +0000 UTC m=+904.189874756" Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.367934 4834 scope.go:117] "RemoveContainer" containerID="597e94ab7461bc9f0f126348eefda40cbd8769f595c13a76adccb97892df379c" Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.385444 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xpl7m"] Oct 08 22:38:16 crc kubenswrapper[4834]: I1008 22:38:16.389350 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xpl7m"] Oct 08 22:38:17 crc kubenswrapper[4834]: I1008 22:38:17.025791 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:38:17 crc kubenswrapper[4834]: I1008 22:38:17.025882 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" 
podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:38:17 crc kubenswrapper[4834]: I1008 22:38:17.562410 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" path="/var/lib/kubelet/pods/c1bb28c5-e0f9-407e-ae86-6ec23348a786/volumes" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.627601 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-5qfsl"] Oct 08 22:38:20 crc kubenswrapper[4834]: E1008 22:38:20.628244 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="extract-utilities" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.628266 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="extract-utilities" Oct 08 22:38:20 crc kubenswrapper[4834]: E1008 22:38:20.628301 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="extract-content" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.628315 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="extract-content" Oct 08 22:38:20 crc kubenswrapper[4834]: E1008 22:38:20.628332 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="registry-server" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.628345 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" containerName="registry-server" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.628539 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1bb28c5-e0f9-407e-ae86-6ec23348a786" 
containerName="registry-server" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.629183 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.632849 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.632871 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.633200 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qnjz8" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.643737 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-5qfsl"] Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.783658 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wsz\" (UniqueName: \"kubernetes.io/projected/3ebaa304-2d20-49a0-8c2d-46c1f53e94bb-kube-api-access-s6wsz\") pod \"cert-manager-webhook-d969966f-5qfsl\" (UID: \"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb\") " pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.783766 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ebaa304-2d20-49a0-8c2d-46c1f53e94bb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-5qfsl\" (UID: \"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb\") " pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.884978 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wsz\" (UniqueName: 
\"kubernetes.io/projected/3ebaa304-2d20-49a0-8c2d-46c1f53e94bb-kube-api-access-s6wsz\") pod \"cert-manager-webhook-d969966f-5qfsl\" (UID: \"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb\") " pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.885038 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ebaa304-2d20-49a0-8c2d-46c1f53e94bb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-5qfsl\" (UID: \"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb\") " pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.910615 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wsz\" (UniqueName: \"kubernetes.io/projected/3ebaa304-2d20-49a0-8c2d-46c1f53e94bb-kube-api-access-s6wsz\") pod \"cert-manager-webhook-d969966f-5qfsl\" (UID: \"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb\") " pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.926793 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ebaa304-2d20-49a0-8c2d-46c1f53e94bb-bound-sa-token\") pod \"cert-manager-webhook-d969966f-5qfsl\" (UID: \"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb\") " pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:20 crc kubenswrapper[4834]: I1008 22:38:20.959236 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:21 crc kubenswrapper[4834]: I1008 22:38:21.404731 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-5qfsl"] Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.354676 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" event={"ID":"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb","Type":"ContainerStarted","Data":"6e76e0d939f2432ce213fe723e7df81390de430d03e5bb8fde38952886320ef7"} Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.422498 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd"] Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.423973 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.469712 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-94dl2" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.471901 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd"] Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.609434 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/019cf26b-39c2-4ee8-b93f-6cdb0c3310cd-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lr8nd\" (UID: \"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.609494 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m459m\" (UniqueName: 
\"kubernetes.io/projected/019cf26b-39c2-4ee8-b93f-6cdb0c3310cd-kube-api-access-m459m\") pod \"cert-manager-cainjector-7d9f95dbf-lr8nd\" (UID: \"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.710489 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/019cf26b-39c2-4ee8-b93f-6cdb0c3310cd-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lr8nd\" (UID: \"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.710565 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m459m\" (UniqueName: \"kubernetes.io/projected/019cf26b-39c2-4ee8-b93f-6cdb0c3310cd-kube-api-access-m459m\") pod \"cert-manager-cainjector-7d9f95dbf-lr8nd\" (UID: \"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.730272 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/019cf26b-39c2-4ee8-b93f-6cdb0c3310cd-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-lr8nd\" (UID: \"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.732934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m459m\" (UniqueName: \"kubernetes.io/projected/019cf26b-39c2-4ee8-b93f-6cdb0c3310cd-kube-api-access-m459m\") pod \"cert-manager-cainjector-7d9f95dbf-lr8nd\" (UID: \"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.786234 4834 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" Oct 08 22:38:22 crc kubenswrapper[4834]: I1008 22:38:22.977956 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd"] Oct 08 22:38:23 crc kubenswrapper[4834]: I1008 22:38:23.363159 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" event={"ID":"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd","Type":"ContainerStarted","Data":"9253fa703feeb75b422362db5d5eb88a1aa7aab0a6af266d4aa7dd85fea03b64"} Oct 08 22:38:26 crc kubenswrapper[4834]: I1008 22:38:26.424129 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" event={"ID":"019cf26b-39c2-4ee8-b93f-6cdb0c3310cd","Type":"ContainerStarted","Data":"b96565d72ad246e52d31fb65b5db03910924bf834cbbd63865ba983ae609e017"} Oct 08 22:38:26 crc kubenswrapper[4834]: I1008 22:38:26.427043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" event={"ID":"3ebaa304-2d20-49a0-8c2d-46c1f53e94bb","Type":"ContainerStarted","Data":"01059c25aba22d3e7d96d05b73576dbb10f827aeecb97ee6314daf00173efa77"} Oct 08 22:38:26 crc kubenswrapper[4834]: I1008 22:38:26.428371 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:26 crc kubenswrapper[4834]: I1008 22:38:26.458575 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-lr8nd" podStartSLOduration=1.860507229 podStartE2EDuration="4.458538673s" podCreationTimestamp="2025-10-08 22:38:22 +0000 UTC" firstStartedPulling="2025-10-08 22:38:23.000248456 +0000 UTC m=+910.823133222" lastFinishedPulling="2025-10-08 22:38:25.59827992 +0000 UTC m=+913.421164666" observedRunningTime="2025-10-08 22:38:26.450127017 +0000 UTC 
m=+914.273011813" watchObservedRunningTime="2025-10-08 22:38:26.458538673 +0000 UTC m=+914.281423459" Oct 08 22:38:26 crc kubenswrapper[4834]: I1008 22:38:26.486603 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" podStartSLOduration=2.286102295 podStartE2EDuration="6.4865709s" podCreationTimestamp="2025-10-08 22:38:20 +0000 UTC" firstStartedPulling="2025-10-08 22:38:21.424606482 +0000 UTC m=+909.247491258" lastFinishedPulling="2025-10-08 22:38:25.625075117 +0000 UTC m=+913.447959863" observedRunningTime="2025-10-08 22:38:26.470173759 +0000 UTC m=+914.293058515" watchObservedRunningTime="2025-10-08 22:38:26.4865709 +0000 UTC m=+914.309455676" Oct 08 22:38:30 crc kubenswrapper[4834]: I1008 22:38:30.963926 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-5qfsl" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.389229 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5vrh"] Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.391784 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.396946 4834 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-h77qx" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.401674 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5vrh"] Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.408386 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzkn\" (UniqueName: \"kubernetes.io/projected/f01ea414-13bf-4228-a97f-32f5810dfd5b-kube-api-access-5bzkn\") pod \"cert-manager-7d4cc89fcb-g5vrh\" (UID: \"f01ea414-13bf-4228-a97f-32f5810dfd5b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.408584 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f01ea414-13bf-4228-a97f-32f5810dfd5b-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5vrh\" (UID: \"f01ea414-13bf-4228-a97f-32f5810dfd5b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.509889 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f01ea414-13bf-4228-a97f-32f5810dfd5b-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5vrh\" (UID: \"f01ea414-13bf-4228-a97f-32f5810dfd5b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.510124 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzkn\" (UniqueName: \"kubernetes.io/projected/f01ea414-13bf-4228-a97f-32f5810dfd5b-kube-api-access-5bzkn\") pod \"cert-manager-7d4cc89fcb-g5vrh\" (UID: 
\"f01ea414-13bf-4228-a97f-32f5810dfd5b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.542733 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f01ea414-13bf-4228-a97f-32f5810dfd5b-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-g5vrh\" (UID: \"f01ea414-13bf-4228-a97f-32f5810dfd5b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.543203 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzkn\" (UniqueName: \"kubernetes.io/projected/f01ea414-13bf-4228-a97f-32f5810dfd5b-kube-api-access-5bzkn\") pod \"cert-manager-7d4cc89fcb-g5vrh\" (UID: \"f01ea414-13bf-4228-a97f-32f5810dfd5b\") " pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.718482 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" Oct 08 22:38:39 crc kubenswrapper[4834]: I1008 22:38:39.997894 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-g5vrh"] Oct 08 22:38:40 crc kubenswrapper[4834]: W1008 22:38:40.009676 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01ea414_13bf_4228_a97f_32f5810dfd5b.slice/crio-e7e924faf6dabc3da1c4f535f163e50f4abad385c9350f7543aa2c664f1107a2 WatchSource:0}: Error finding container e7e924faf6dabc3da1c4f535f163e50f4abad385c9350f7543aa2c664f1107a2: Status 404 returned error can't find the container with id e7e924faf6dabc3da1c4f535f163e50f4abad385c9350f7543aa2c664f1107a2 Oct 08 22:38:40 crc kubenswrapper[4834]: I1008 22:38:40.546634 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" 
event={"ID":"f01ea414-13bf-4228-a97f-32f5810dfd5b","Type":"ContainerStarted","Data":"75be6a49fa5fc4f2027d9bf3f73180d2ec362e83bdb32afaf9b6886763e0a710"} Oct 08 22:38:40 crc kubenswrapper[4834]: I1008 22:38:40.546757 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" event={"ID":"f01ea414-13bf-4228-a97f-32f5810dfd5b","Type":"ContainerStarted","Data":"e7e924faf6dabc3da1c4f535f163e50f4abad385c9350f7543aa2c664f1107a2"} Oct 08 22:38:40 crc kubenswrapper[4834]: I1008 22:38:40.574407 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-g5vrh" podStartSLOduration=1.574361708 podStartE2EDuration="1.574361708s" podCreationTimestamp="2025-10-08 22:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:38:40.56874721 +0000 UTC m=+928.391631966" watchObservedRunningTime="2025-10-08 22:38:40.574361708 +0000 UTC m=+928.397246464" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.234479 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fgcvd"] Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.235944 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.241776 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.241977 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.242615 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cl9d4" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.253869 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fgcvd"] Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.284506 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp5r5\" (UniqueName: \"kubernetes.io/projected/973cf5c7-1209-49ad-bb4e-02b88d9d2df4-kube-api-access-rp5r5\") pod \"openstack-operator-index-fgcvd\" (UID: \"973cf5c7-1209-49ad-bb4e-02b88d9d2df4\") " pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.385619 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp5r5\" (UniqueName: \"kubernetes.io/projected/973cf5c7-1209-49ad-bb4e-02b88d9d2df4-kube-api-access-rp5r5\") pod \"openstack-operator-index-fgcvd\" (UID: \"973cf5c7-1209-49ad-bb4e-02b88d9d2df4\") " pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.403420 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp5r5\" (UniqueName: \"kubernetes.io/projected/973cf5c7-1209-49ad-bb4e-02b88d9d2df4-kube-api-access-rp5r5\") pod \"openstack-operator-index-fgcvd\" (UID: 
\"973cf5c7-1209-49ad-bb4e-02b88d9d2df4\") " pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.569406 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:44 crc kubenswrapper[4834]: I1008 22:38:44.822023 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fgcvd"] Oct 08 22:38:44 crc kubenswrapper[4834]: W1008 22:38:44.829953 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973cf5c7_1209_49ad_bb4e_02b88d9d2df4.slice/crio-28670cfe653ebd241bb4180ddcb2adcc5b5532ee1ef5750a13310cd3627047ca WatchSource:0}: Error finding container 28670cfe653ebd241bb4180ddcb2adcc5b5532ee1ef5750a13310cd3627047ca: Status 404 returned error can't find the container with id 28670cfe653ebd241bb4180ddcb2adcc5b5532ee1ef5750a13310cd3627047ca Oct 08 22:38:45 crc kubenswrapper[4834]: I1008 22:38:45.598793 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fgcvd" event={"ID":"973cf5c7-1209-49ad-bb4e-02b88d9d2df4","Type":"ContainerStarted","Data":"28670cfe653ebd241bb4180ddcb2adcc5b5532ee1ef5750a13310cd3627047ca"} Oct 08 22:38:46 crc kubenswrapper[4834]: I1008 22:38:46.608955 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fgcvd" event={"ID":"973cf5c7-1209-49ad-bb4e-02b88d9d2df4","Type":"ContainerStarted","Data":"2664e9e13100ff9a556f491622b91b153f1bc3324c6698c2fdc611067ea43aea"} Oct 08 22:38:46 crc kubenswrapper[4834]: I1008 22:38:46.630305 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fgcvd" podStartSLOduration=1.943495564 podStartE2EDuration="2.630285837s" podCreationTimestamp="2025-10-08 22:38:44 +0000 UTC" 
firstStartedPulling="2025-10-08 22:38:44.833889051 +0000 UTC m=+932.656773817" lastFinishedPulling="2025-10-08 22:38:45.520679344 +0000 UTC m=+933.343564090" observedRunningTime="2025-10-08 22:38:46.627432557 +0000 UTC m=+934.450317333" watchObservedRunningTime="2025-10-08 22:38:46.630285837 +0000 UTC m=+934.453170613" Oct 08 22:38:47 crc kubenswrapper[4834]: I1008 22:38:47.025944 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:38:47 crc kubenswrapper[4834]: I1008 22:38:47.026075 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:38:54 crc kubenswrapper[4834]: I1008 22:38:54.570389 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:54 crc kubenswrapper[4834]: I1008 22:38:54.571091 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:54 crc kubenswrapper[4834]: I1008 22:38:54.617835 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:38:54 crc kubenswrapper[4834]: I1008 22:38:54.722235 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fgcvd" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.486580 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh"] Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.488587 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.491468 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wtnkv" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.494246 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh"] Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.566788 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-util\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.566960 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dq9l\" (UniqueName: \"kubernetes.io/projected/1d0326a6-118b-492f-98fd-139f5b8fdcf0-kube-api-access-5dq9l\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.567009 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-bundle\") pod 
\"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.668282 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dq9l\" (UniqueName: \"kubernetes.io/projected/1d0326a6-118b-492f-98fd-139f5b8fdcf0-kube-api-access-5dq9l\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.668388 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-bundle\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.668449 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-util\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.668954 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-bundle\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " 
pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.669253 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-util\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.690742 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dq9l\" (UniqueName: \"kubernetes.io/projected/1d0326a6-118b-492f-98fd-139f5b8fdcf0-kube-api-access-5dq9l\") pod \"184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:01 crc kubenswrapper[4834]: I1008 22:39:01.858128 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:02 crc kubenswrapper[4834]: I1008 22:39:02.335369 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh"] Oct 08 22:39:02 crc kubenswrapper[4834]: I1008 22:39:02.757548 4834 generic.go:334] "Generic (PLEG): container finished" podID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerID="6fc90f29b2e9d6945566bdcd29801e910e499b485a7a59a6d227fe34684d0dc1" exitCode=0 Oct 08 22:39:02 crc kubenswrapper[4834]: I1008 22:39:02.757750 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" event={"ID":"1d0326a6-118b-492f-98fd-139f5b8fdcf0","Type":"ContainerDied","Data":"6fc90f29b2e9d6945566bdcd29801e910e499b485a7a59a6d227fe34684d0dc1"} Oct 08 22:39:02 crc kubenswrapper[4834]: I1008 22:39:02.758203 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" event={"ID":"1d0326a6-118b-492f-98fd-139f5b8fdcf0","Type":"ContainerStarted","Data":"875cee04653a7d64e54bbb5e003859dd8d3e1529621b9ac40dba766652d30665"} Oct 08 22:39:03 crc kubenswrapper[4834]: I1008 22:39:03.770837 4834 generic.go:334] "Generic (PLEG): container finished" podID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerID="72e7bc43cc1d27a4476df22647e1845d164053538b29c8cc6e176cdddd5c04d6" exitCode=0 Oct 08 22:39:03 crc kubenswrapper[4834]: I1008 22:39:03.770921 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" event={"ID":"1d0326a6-118b-492f-98fd-139f5b8fdcf0","Type":"ContainerDied","Data":"72e7bc43cc1d27a4476df22647e1845d164053538b29c8cc6e176cdddd5c04d6"} Oct 08 22:39:04 crc kubenswrapper[4834]: I1008 22:39:04.784389 4834 generic.go:334] 
"Generic (PLEG): container finished" podID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerID="c8adad8b9759761727b2705c2293e649d3193f7d88f1c62146b88ef9f6a9be72" exitCode=0 Oct 08 22:39:04 crc kubenswrapper[4834]: I1008 22:39:04.784471 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" event={"ID":"1d0326a6-118b-492f-98fd-139f5b8fdcf0","Type":"ContainerDied","Data":"c8adad8b9759761727b2705c2293e649d3193f7d88f1c62146b88ef9f6a9be72"} Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.078442 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.242127 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-util\") pod \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.242486 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dq9l\" (UniqueName: \"kubernetes.io/projected/1d0326a6-118b-492f-98fd-139f5b8fdcf0-kube-api-access-5dq9l\") pod \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.242564 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-bundle\") pod \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\" (UID: \"1d0326a6-118b-492f-98fd-139f5b8fdcf0\") " Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.243360 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-bundle" (OuterVolumeSpecName: "bundle") pod "1d0326a6-118b-492f-98fd-139f5b8fdcf0" (UID: "1d0326a6-118b-492f-98fd-139f5b8fdcf0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.249401 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d0326a6-118b-492f-98fd-139f5b8fdcf0-kube-api-access-5dq9l" (OuterVolumeSpecName: "kube-api-access-5dq9l") pod "1d0326a6-118b-492f-98fd-139f5b8fdcf0" (UID: "1d0326a6-118b-492f-98fd-139f5b8fdcf0"). InnerVolumeSpecName "kube-api-access-5dq9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.261200 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-util" (OuterVolumeSpecName: "util") pod "1d0326a6-118b-492f-98fd-139f5b8fdcf0" (UID: "1d0326a6-118b-492f-98fd-139f5b8fdcf0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.344182 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dq9l\" (UniqueName: \"kubernetes.io/projected/1d0326a6-118b-492f-98fd-139f5b8fdcf0-kube-api-access-5dq9l\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.344229 4834 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.344241 4834 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d0326a6-118b-492f-98fd-139f5b8fdcf0-util\") on node \"crc\" DevicePath \"\"" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.805885 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" event={"ID":"1d0326a6-118b-492f-98fd-139f5b8fdcf0","Type":"ContainerDied","Data":"875cee04653a7d64e54bbb5e003859dd8d3e1529621b9ac40dba766652d30665"} Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.805940 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="875cee04653a7d64e54bbb5e003859dd8d3e1529621b9ac40dba766652d30665" Oct 08 22:39:06 crc kubenswrapper[4834]: I1008 22:39:06.806022 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.058242 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm"] Oct 08 22:39:14 crc kubenswrapper[4834]: E1008 22:39:14.059885 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="extract" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.060019 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="extract" Oct 08 22:39:14 crc kubenswrapper[4834]: E1008 22:39:14.060075 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="pull" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.060134 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="pull" Oct 08 22:39:14 crc kubenswrapper[4834]: E1008 22:39:14.060227 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="util" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.060301 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="util" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.060465 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d0326a6-118b-492f-98fd-139f5b8fdcf0" containerName="extract" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.061116 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.065547 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-28j89" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.087917 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm"] Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.164663 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnhfw\" (UniqueName: \"kubernetes.io/projected/3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a-kube-api-access-lnhfw\") pod \"openstack-operator-controller-operator-848c57cb5c-dclqm\" (UID: \"3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a\") " pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.266281 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnhfw\" (UniqueName: \"kubernetes.io/projected/3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a-kube-api-access-lnhfw\") pod \"openstack-operator-controller-operator-848c57cb5c-dclqm\" (UID: \"3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a\") " pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.289189 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnhfw\" (UniqueName: \"kubernetes.io/projected/3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a-kube-api-access-lnhfw\") pod \"openstack-operator-controller-operator-848c57cb5c-dclqm\" (UID: \"3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a\") " pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.376529 4834 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.709900 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm"] Oct 08 22:39:14 crc kubenswrapper[4834]: I1008 22:39:14.878724 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" event={"ID":"3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a","Type":"ContainerStarted","Data":"e2849e66d21e6b160e6eb1780f0709a89deafc46ec973bc2233dc35409d709f4"} Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.024986 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.025463 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.025515 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.026188 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c4baa6db4e38cdb0c141b99f90c2bb4b5f7f47f94b55109a60fc26c2c73b21d9"} 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.026293 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://c4baa6db4e38cdb0c141b99f90c2bb4b5f7f47f94b55109a60fc26c2c73b21d9" gracePeriod=600 Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.918747 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="c4baa6db4e38cdb0c141b99f90c2bb4b5f7f47f94b55109a60fc26c2c73b21d9" exitCode=0 Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.918974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"c4baa6db4e38cdb0c141b99f90c2bb4b5f7f47f94b55109a60fc26c2c73b21d9"} Oct 08 22:39:17 crc kubenswrapper[4834]: I1008 22:39:17.919464 4834 scope.go:117] "RemoveContainer" containerID="52a93a6fe63650f6a03b209142df1e3ce01f805d96d6142f73c9c419354c0aca" Oct 08 22:39:19 crc kubenswrapper[4834]: I1008 22:39:19.939107 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" event={"ID":"3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a","Type":"ContainerStarted","Data":"95eccaed684c022248520d8fd705398a6419e631feaa149e624085547e0d7fbc"} Oct 08 22:39:19 crc kubenswrapper[4834]: I1008 22:39:19.947623 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" 
event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"ae7fc299ba63d30578a076e14832e2ba3dd0a6f32f375b1c858285b17f026ca6"} Oct 08 22:39:21 crc kubenswrapper[4834]: I1008 22:39:21.967960 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" event={"ID":"3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a","Type":"ContainerStarted","Data":"b00a530af802c713b5bf9e57894bfe12f6bde4dc9ab10345281d14e04afec427"} Oct 08 22:39:21 crc kubenswrapper[4834]: I1008 22:39:21.968881 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:22 crc kubenswrapper[4834]: I1008 22:39:22.028053 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" podStartSLOduration=1.28923429 podStartE2EDuration="8.028034103s" podCreationTimestamp="2025-10-08 22:39:14 +0000 UTC" firstStartedPulling="2025-10-08 22:39:14.723454214 +0000 UTC m=+962.546338960" lastFinishedPulling="2025-10-08 22:39:21.462254027 +0000 UTC m=+969.285138773" observedRunningTime="2025-10-08 22:39:22.018966001 +0000 UTC m=+969.841850787" watchObservedRunningTime="2025-10-08 22:39:22.028034103 +0000 UTC m=+969.850918859" Oct 08 22:39:24 crc kubenswrapper[4834]: I1008 22:39:24.380504 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-848c57cb5c-dclqm" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.560799 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.562262 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.564320 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jsb66" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.578203 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.581270 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.582166 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.584078 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-sm6lm" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.602958 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.606718 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.608511 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.611033 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-7pp4l" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.624787 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.625816 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.627380 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-78lk2" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.644640 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.647921 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.665855 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lbb\" (UniqueName: \"kubernetes.io/projected/6dbd0034-b992-4a64-ab92-268abe380d03-kube-api-access-h7lbb\") pod \"cinder-operator-controller-manager-7b7fb68549-8fbc8\" (UID: \"6dbd0034-b992-4a64-ab92-268abe380d03\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.665901 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzmf7\" 
(UniqueName: \"kubernetes.io/projected/775cc910-4ea9-4da1-b35a-a31b4c880010-kube-api-access-dzmf7\") pod \"designate-operator-controller-manager-85d5d9dd78-8fhdd\" (UID: \"775cc910-4ea9-4da1-b35a-a31b4c880010\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.666010 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2cd\" (UniqueName: \"kubernetes.io/projected/864591d4-af96-44e6-8a1f-a01bf0b9fb44-kube-api-access-zb2cd\") pod \"barbican-operator-controller-manager-658bdf4b74-dvbzh\" (UID: \"864591d4-af96-44e6-8a1f-a01bf0b9fb44\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.667218 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ztc\" (UniqueName: \"kubernetes.io/projected/4f3521e9-05df-408a-a765-7a7ba0046afa-kube-api-access-98ztc\") pod \"glance-operator-controller-manager-84b9b84486-nf4vz\" (UID: \"4f3521e9-05df-408a-a765-7a7ba0046afa\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.669851 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.670808 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.672356 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gbxvq" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.678465 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.679382 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.684775 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2stzz" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.700131 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.704842 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.705973 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.711315 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.712357 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.713112 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ht6pt" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.713331 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.716400 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-h9h25" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.735589 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.752197 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.766370 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768496 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m2f9\" (UniqueName: \"kubernetes.io/projected/5d7e03d9-baa9-4867-9fbc-91a82a36f4e2-kube-api-access-2m2f9\") pod \"horizon-operator-controller-manager-7ffbcb7588-lwsbk\" (UID: \"5d7e03d9-baa9-4867-9fbc-91a82a36f4e2\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768537 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrtjs\" (UniqueName: 
\"kubernetes.io/projected/b886f9b3-c296-4445-9a60-cb6809463741-kube-api-access-mrtjs\") pod \"heat-operator-controller-manager-858f76bbdd-xprwf\" (UID: \"b886f9b3-c296-4445-9a60-cb6809463741\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768561 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768594 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2cd\" (UniqueName: \"kubernetes.io/projected/864591d4-af96-44e6-8a1f-a01bf0b9fb44-kube-api-access-zb2cd\") pod \"barbican-operator-controller-manager-658bdf4b74-dvbzh\" (UID: \"864591d4-af96-44e6-8a1f-a01bf0b9fb44\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768631 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d49j\" (UniqueName: \"kubernetes.io/projected/baa45e46-72a6-4f2f-af9e-ce679038b8f1-kube-api-access-6d49j\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768661 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98ztc\" (UniqueName: \"kubernetes.io/projected/4f3521e9-05df-408a-a765-7a7ba0046afa-kube-api-access-98ztc\") pod \"glance-operator-controller-manager-84b9b84486-nf4vz\" (UID: 
\"4f3521e9-05df-408a-a765-7a7ba0046afa\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768693 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lbb\" (UniqueName: \"kubernetes.io/projected/6dbd0034-b992-4a64-ab92-268abe380d03-kube-api-access-h7lbb\") pod \"cinder-operator-controller-manager-7b7fb68549-8fbc8\" (UID: \"6dbd0034-b992-4a64-ab92-268abe380d03\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzmf7\" (UniqueName: \"kubernetes.io/projected/775cc910-4ea9-4da1-b35a-a31b4c880010-kube-api-access-dzmf7\") pod \"designate-operator-controller-manager-85d5d9dd78-8fhdd\" (UID: \"775cc910-4ea9-4da1-b35a-a31b4c880010\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.768731 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jxsk\" (UniqueName: \"kubernetes.io/projected/84427f76-6342-4e6b-9875-56b2d3db0fac-kube-api-access-7jxsk\") pod \"ironic-operator-controller-manager-9c5c78d49-ssskr\" (UID: \"84427f76-6342-4e6b-9875-56b2d3db0fac\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.773999 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.775275 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.777455 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-kpkjr" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.784811 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.785862 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.808053 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ztc\" (UniqueName: \"kubernetes.io/projected/4f3521e9-05df-408a-a765-7a7ba0046afa-kube-api-access-98ztc\") pod \"glance-operator-controller-manager-84b9b84486-nf4vz\" (UID: \"4f3521e9-05df-408a-a765-7a7ba0046afa\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.811600 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sgc69" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.811755 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.817808 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzmf7\" (UniqueName: \"kubernetes.io/projected/775cc910-4ea9-4da1-b35a-a31b4c880010-kube-api-access-dzmf7\") pod \"designate-operator-controller-manager-85d5d9dd78-8fhdd\" (UID: \"775cc910-4ea9-4da1-b35a-a31b4c880010\") " 
pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.824883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lbb\" (UniqueName: \"kubernetes.io/projected/6dbd0034-b992-4a64-ab92-268abe380d03-kube-api-access-h7lbb\") pod \"cinder-operator-controller-manager-7b7fb68549-8fbc8\" (UID: \"6dbd0034-b992-4a64-ab92-268abe380d03\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.832236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2cd\" (UniqueName: \"kubernetes.io/projected/864591d4-af96-44e6-8a1f-a01bf0b9fb44-kube-api-access-zb2cd\") pod \"barbican-operator-controller-manager-658bdf4b74-dvbzh\" (UID: \"864591d4-af96-44e6-8a1f-a01bf0b9fb44\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.883454 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.886857 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.888278 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d49j\" (UniqueName: \"kubernetes.io/projected/baa45e46-72a6-4f2f-af9e-ce679038b8f1-kube-api-access-6d49j\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.888339 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jxsk\" (UniqueName: \"kubernetes.io/projected/84427f76-6342-4e6b-9875-56b2d3db0fac-kube-api-access-7jxsk\") pod \"ironic-operator-controller-manager-9c5c78d49-ssskr\" (UID: \"84427f76-6342-4e6b-9875-56b2d3db0fac\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.888367 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m2f9\" (UniqueName: \"kubernetes.io/projected/5d7e03d9-baa9-4867-9fbc-91a82a36f4e2-kube-api-access-2m2f9\") pod \"horizon-operator-controller-manager-7ffbcb7588-lwsbk\" (UID: \"5d7e03d9-baa9-4867-9fbc-91a82a36f4e2\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.888384 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrtjs\" (UniqueName: \"kubernetes.io/projected/b886f9b3-c296-4445-9a60-cb6809463741-kube-api-access-mrtjs\") pod \"heat-operator-controller-manager-858f76bbdd-xprwf\" (UID: \"b886f9b3-c296-4445-9a60-cb6809463741\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.888405 
4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:40 crc kubenswrapper[4834]: E1008 22:39:40.888513 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 08 22:39:40 crc kubenswrapper[4834]: E1008 22:39:40.888556 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert podName:baa45e46-72a6-4f2f-af9e-ce679038b8f1 nodeName:}" failed. No retries permitted until 2025-10-08 22:39:41.388540178 +0000 UTC m=+989.211424914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert") pod "infra-operator-controller-manager-656bcbd775-mxst7" (UID: "baa45e46-72a6-4f2f-af9e-ce679038b8f1") : secret "infra-operator-webhook-server-cert" not found Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.897261 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.898800 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.901483 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.903249 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dmx2m" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.926112 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.931353 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d49j\" (UniqueName: \"kubernetes.io/projected/baa45e46-72a6-4f2f-af9e-ce679038b8f1-kube-api-access-6d49j\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.931684 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jxsk\" (UniqueName: \"kubernetes.io/projected/84427f76-6342-4e6b-9875-56b2d3db0fac-kube-api-access-7jxsk\") pod \"ironic-operator-controller-manager-9c5c78d49-ssskr\" (UID: \"84427f76-6342-4e6b-9875-56b2d3db0fac\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.940666 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.945761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrtjs\" (UniqueName: \"kubernetes.io/projected/b886f9b3-c296-4445-9a60-cb6809463741-kube-api-access-mrtjs\") pod \"heat-operator-controller-manager-858f76bbdd-xprwf\" (UID: \"b886f9b3-c296-4445-9a60-cb6809463741\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.945876 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.953863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m2f9\" (UniqueName: \"kubernetes.io/projected/5d7e03d9-baa9-4867-9fbc-91a82a36f4e2-kube-api-access-2m2f9\") pod \"horizon-operator-controller-manager-7ffbcb7588-lwsbk\" (UID: \"5d7e03d9-baa9-4867-9fbc-91a82a36f4e2\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.979462 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-rw77z"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.980874 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.990054 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clng2\" (UniqueName: \"kubernetes.io/projected/306dc2e6-3f9b-45c6-b615-75a6d2098857-kube-api-access-clng2\") pod \"keystone-operator-controller-manager-55b6b7c7b8-z5gzx\" (UID: \"306dc2e6-3f9b-45c6-b615-75a6d2098857\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.990134 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r62lv\" (UniqueName: \"kubernetes.io/projected/5452435d-906d-4f08-87e9-168187eb4d5c-kube-api-access-r62lv\") pod \"manila-operator-controller-manager-5f67fbc655-q2qld\" (UID: \"5452435d-906d-4f08-87e9-168187eb4d5c\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.991065 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tt5t9" Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.991355 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk"] Oct 08 22:39:40 crc kubenswrapper[4834]: I1008 22:39:40.992305 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.004651 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-2g76m" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.008498 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.009987 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.014033 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8765r" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.022477 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.022900 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.028680 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-rw77z"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.033100 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.033998 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.040208 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.041286 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.043757 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4pf57" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.044257 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.058356 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.060022 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.063250 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2r754" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.068577 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.074186 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.075311 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.079951 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-kdkg5" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.083841 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.091521 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.092283 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r62lv\" (UniqueName: \"kubernetes.io/projected/5452435d-906d-4f08-87e9-168187eb4d5c-kube-api-access-r62lv\") pod \"manila-operator-controller-manager-5f67fbc655-q2qld\" (UID: \"5452435d-906d-4f08-87e9-168187eb4d5c\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.092350 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4mf\" (UniqueName: \"kubernetes.io/projected/6a282f5d-5a52-4a80-bf82-6eea47007564-kube-api-access-tl4mf\") pod \"nova-operator-controller-manager-5df598886f-rw77z\" (UID: \"6a282f5d-5a52-4a80-bf82-6eea47007564\") " 
pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.092399 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzw6x\" (UniqueName: \"kubernetes.io/projected/0cdb778f-1017-4f86-9443-85d7bf158bd6-kube-api-access-bzw6x\") pod \"neutron-operator-controller-manager-79d585cb66-jkmf4\" (UID: \"0cdb778f-1017-4f86-9443-85d7bf158bd6\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.092419 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clng2\" (UniqueName: \"kubernetes.io/projected/306dc2e6-3f9b-45c6-b615-75a6d2098857-kube-api-access-clng2\") pod \"keystone-operator-controller-manager-55b6b7c7b8-z5gzx\" (UID: \"306dc2e6-3f9b-45c6-b615-75a6d2098857\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.092442 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbbg4\" (UniqueName: \"kubernetes.io/projected/cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82-kube-api-access-vbbg4\") pod \"mariadb-operator-controller-manager-f9fb45f8f-7mh2r\" (UID: \"cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.117292 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.118227 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clng2\" (UniqueName: \"kubernetes.io/projected/306dc2e6-3f9b-45c6-b615-75a6d2098857-kube-api-access-clng2\") pod 
\"keystone-operator-controller-manager-55b6b7c7b8-z5gzx\" (UID: \"306dc2e6-3f9b-45c6-b615-75a6d2098857\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.118407 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.120927 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kh5k2" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.123976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r62lv\" (UniqueName: \"kubernetes.io/projected/5452435d-906d-4f08-87e9-168187eb4d5c-kube-api-access-r62lv\") pod \"manila-operator-controller-manager-5f67fbc655-q2qld\" (UID: \"5452435d-906d-4f08-87e9-168187eb4d5c\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.129208 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.136753 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.138770 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.141534 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bgnxr" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.142428 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.150926 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.186482 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-phcvn"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.188588 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.191091 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-67kf2" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.191574 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193430 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4mf\" (UniqueName: \"kubernetes.io/projected/6a282f5d-5a52-4a80-bf82-6eea47007564-kube-api-access-tl4mf\") pod \"nova-operator-controller-manager-5df598886f-rw77z\" (UID: \"6a282f5d-5a52-4a80-bf82-6eea47007564\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193507 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86ns4\" (UniqueName: \"kubernetes.io/projected/f8aea845-c4c0-469f-ac39-9f4525e69ec5-kube-api-access-86ns4\") pod \"octavia-operator-controller-manager-69fdcfc5f5-zqxbk\" (UID: \"f8aea845-c4c0-469f-ac39-9f4525e69ec5\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzw6x\" (UniqueName: \"kubernetes.io/projected/0cdb778f-1017-4f86-9443-85d7bf158bd6-kube-api-access-bzw6x\") pod \"neutron-operator-controller-manager-79d585cb66-jkmf4\" (UID: \"0cdb778f-1017-4f86-9443-85d7bf158bd6\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193576 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbbg4\" (UniqueName: \"kubernetes.io/projected/cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82-kube-api-access-vbbg4\") pod \"mariadb-operator-controller-manager-f9fb45f8f-7mh2r\" (UID: \"cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:39:41 crc 
kubenswrapper[4834]: I1008 22:39:41.193593 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxzq\" (UniqueName: \"kubernetes.io/projected/8e0f70ea-9d3e-4ada-834a-1134f8485204-kube-api-access-mnxzq\") pod \"test-operator-controller-manager-5458f77c4-phcvn\" (UID: \"8e0f70ea-9d3e-4ada-834a-1134f8485204\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193619 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lpqh\" (UniqueName: \"kubernetes.io/projected/6c2fa94f-b7fc-496b-a2f4-81695f1d86b2-kube-api-access-6lpqh\") pod \"telemetry-operator-controller-manager-67cfc6749b-7lzg8\" (UID: \"6c2fa94f-b7fc-496b-a2f4-81695f1d86b2\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193636 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxj8m\" (UniqueName: \"kubernetes.io/projected/d062b1e7-96b9-48e0-acb3-528dc3c7e59c-kube-api-access-sxj8m\") pod \"ovn-operator-controller-manager-79db49b9fb-q9bmt\" (UID: \"d062b1e7-96b9-48e0-acb3-528dc3c7e59c\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193658 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbvrp\" (UniqueName: \"kubernetes.io/projected/95926917-39a4-4757-b246-874a680f97ce-kube-api-access-dbvrp\") pod \"swift-operator-controller-manager-db6d7f97b-vxg74\" (UID: \"95926917-39a4-4757-b246-874a680f97ce\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193686 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6mxx\" (UniqueName: \"kubernetes.io/projected/3ffe8d9e-bb57-4156-8eff-fdb070a67e6d-kube-api-access-g6mxx\") pod \"placement-operator-controller-manager-68b6c87b68-6r5r9\" (UID: \"3ffe8d9e-bb57-4156-8eff-fdb070a67e6d\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193705 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fr2p\" (UniqueName: \"kubernetes.io/projected/96bb0f12-b144-4f24-9b48-407519d51c6e-kube-api-access-2fr2p\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.193723 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96bb0f12-b144-4f24-9b48-407519d51c6e-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.205289 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-phcvn"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.223138 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.223191 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzw6x\" (UniqueName: \"kubernetes.io/projected/0cdb778f-1017-4f86-9443-85d7bf158bd6-kube-api-access-bzw6x\") pod \"neutron-operator-controller-manager-79d585cb66-jkmf4\" (UID: \"0cdb778f-1017-4f86-9443-85d7bf158bd6\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.231426 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbbg4\" (UniqueName: \"kubernetes.io/projected/cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82-kube-api-access-vbbg4\") pod \"mariadb-operator-controller-manager-f9fb45f8f-7mh2r\" (UID: \"cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.233461 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4mf\" (UniqueName: \"kubernetes.io/projected/6a282f5d-5a52-4a80-bf82-6eea47007564-kube-api-access-tl4mf\") pod \"nova-operator-controller-manager-5df598886f-rw77z\" (UID: \"6a282f5d-5a52-4a80-bf82-6eea47007564\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.286490 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.287604 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.292396 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lmpg9" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.295714 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296207 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6mxx\" (UniqueName: \"kubernetes.io/projected/3ffe8d9e-bb57-4156-8eff-fdb070a67e6d-kube-api-access-g6mxx\") pod \"placement-operator-controller-manager-68b6c87b68-6r5r9\" (UID: \"3ffe8d9e-bb57-4156-8eff-fdb070a67e6d\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fr2p\" (UniqueName: \"kubernetes.io/projected/96bb0f12-b144-4f24-9b48-407519d51c6e-kube-api-access-2fr2p\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296260 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96bb0f12-b144-4f24-9b48-407519d51c6e-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296316 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-86ns4\" (UniqueName: \"kubernetes.io/projected/f8aea845-c4c0-469f-ac39-9f4525e69ec5-kube-api-access-86ns4\") pod \"octavia-operator-controller-manager-69fdcfc5f5-zqxbk\" (UID: \"f8aea845-c4c0-469f-ac39-9f4525e69ec5\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296346 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxzq\" (UniqueName: \"kubernetes.io/projected/8e0f70ea-9d3e-4ada-834a-1134f8485204-kube-api-access-mnxzq\") pod \"test-operator-controller-manager-5458f77c4-phcvn\" (UID: \"8e0f70ea-9d3e-4ada-834a-1134f8485204\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296370 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpqh\" (UniqueName: \"kubernetes.io/projected/6c2fa94f-b7fc-496b-a2f4-81695f1d86b2-kube-api-access-6lpqh\") pod \"telemetry-operator-controller-manager-67cfc6749b-7lzg8\" (UID: \"6c2fa94f-b7fc-496b-a2f4-81695f1d86b2\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296388 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxj8m\" (UniqueName: \"kubernetes.io/projected/d062b1e7-96b9-48e0-acb3-528dc3c7e59c-kube-api-access-sxj8m\") pod \"ovn-operator-controller-manager-79db49b9fb-q9bmt\" (UID: \"d062b1e7-96b9-48e0-acb3-528dc3c7e59c\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.296406 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbvrp\" (UniqueName: 
\"kubernetes.io/projected/95926917-39a4-4757-b246-874a680f97ce-kube-api-access-dbvrp\") pod \"swift-operator-controller-manager-db6d7f97b-vxg74\" (UID: \"95926917-39a4-4757-b246-874a680f97ce\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" Oct 08 22:39:41 crc kubenswrapper[4834]: E1008 22:39:41.296648 4834 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 22:39:41 crc kubenswrapper[4834]: E1008 22:39:41.296691 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96bb0f12-b144-4f24-9b48-407519d51c6e-cert podName:96bb0f12-b144-4f24-9b48-407519d51c6e nodeName:}" failed. No retries permitted until 2025-10-08 22:39:41.796677918 +0000 UTC m=+989.619562654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/96bb0f12-b144-4f24-9b48-407519d51c6e-cert") pod "openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" (UID: "96bb0f12-b144-4f24-9b48-407519d51c6e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.345260 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.347172 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.359525 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbvrp\" (UniqueName: \"kubernetes.io/projected/95926917-39a4-4757-b246-874a680f97ce-kube-api-access-dbvrp\") pod \"swift-operator-controller-manager-db6d7f97b-vxg74\" (UID: \"95926917-39a4-4757-b246-874a680f97ce\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.359540 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fr2p\" (UniqueName: \"kubernetes.io/projected/96bb0f12-b144-4f24-9b48-407519d51c6e-kube-api-access-2fr2p\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.369907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxj8m\" (UniqueName: \"kubernetes.io/projected/d062b1e7-96b9-48e0-acb3-528dc3c7e59c-kube-api-access-sxj8m\") pod \"ovn-operator-controller-manager-79db49b9fb-q9bmt\" (UID: \"d062b1e7-96b9-48e0-acb3-528dc3c7e59c\") " pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.373654 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.378176 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpqh\" (UniqueName: \"kubernetes.io/projected/6c2fa94f-b7fc-496b-a2f4-81695f1d86b2-kube-api-access-6lpqh\") pod \"telemetry-operator-controller-manager-67cfc6749b-7lzg8\" (UID: \"6c2fa94f-b7fc-496b-a2f4-81695f1d86b2\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.379003 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86ns4\" (UniqueName: \"kubernetes.io/projected/f8aea845-c4c0-469f-ac39-9f4525e69ec5-kube-api-access-86ns4\") pod \"octavia-operator-controller-manager-69fdcfc5f5-zqxbk\" (UID: \"f8aea845-c4c0-469f-ac39-9f4525e69ec5\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.379077 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6mxx\" (UniqueName: \"kubernetes.io/projected/3ffe8d9e-bb57-4156-8eff-fdb070a67e6d-kube-api-access-g6mxx\") pod \"placement-operator-controller-manager-68b6c87b68-6r5r9\" (UID: \"3ffe8d9e-bb57-4156-8eff-fdb070a67e6d\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.381527 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxzq\" (UniqueName: \"kubernetes.io/projected/8e0f70ea-9d3e-4ada-834a-1134f8485204-kube-api-access-mnxzq\") pod \"test-operator-controller-manager-5458f77c4-phcvn\" (UID: \"8e0f70ea-9d3e-4ada-834a-1134f8485204\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.400671 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.400825 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqscf\" (UniqueName: \"kubernetes.io/projected/3286b799-dca1-495d-b329-9c46d97e024b-kube-api-access-fqscf\") pod \"watcher-operator-controller-manager-7f554bff7b-dkc8b\" (UID: \"3286b799-dca1-495d-b329-9c46d97e024b\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:39:41 crc kubenswrapper[4834]: E1008 22:39:41.408603 4834 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 08 22:39:41 crc kubenswrapper[4834]: E1008 22:39:41.408746 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert podName:baa45e46-72a6-4f2f-af9e-ce679038b8f1 nodeName:}" failed. No retries permitted until 2025-10-08 22:39:42.408711886 +0000 UTC m=+990.231596632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert") pod "infra-operator-controller-manager-656bcbd775-mxst7" (UID: "baa45e46-72a6-4f2f-af9e-ce679038b8f1") : secret "infra-operator-webhook-server-cert" not found Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.451534 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.463210 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.464477 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.468304 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.468419 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vx8xb" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.469399 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.497891 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.502545 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqscf\" (UniqueName: \"kubernetes.io/projected/3286b799-dca1-495d-b329-9c46d97e024b-kube-api-access-fqscf\") pod \"watcher-operator-controller-manager-7f554bff7b-dkc8b\" (UID: \"3286b799-dca1-495d-b329-9c46d97e024b\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.507439 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.514525 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.520035 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqscf\" (UniqueName: \"kubernetes.io/projected/3286b799-dca1-495d-b329-9c46d97e024b-kube-api-access-fqscf\") pod \"watcher-operator-controller-manager-7f554bff7b-dkc8b\" (UID: \"3286b799-dca1-495d-b329-9c46d97e024b\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.543820 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.546286 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.547569 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.551330 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9s4dx" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.608581 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.608621 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.609211 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bnnl\" (UniqueName: \"kubernetes.io/projected/00f8a5ab-e561-4b67-a56e-791342c7dbb4-kube-api-access-6bnnl\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.609305 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.651587 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.695698 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.713653 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bnnl\" (UniqueName: \"kubernetes.io/projected/00f8a5ab-e561-4b67-a56e-791342c7dbb4-kube-api-access-6bnnl\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.713783 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt5sc\" (UniqueName: \"kubernetes.io/projected/0f36bacd-60f1-41f9-a0e9-1429cecade32-kube-api-access-wt5sc\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw\" (UID: \"0f36bacd-60f1-41f9-a0e9-1429cecade32\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.713810 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:41 crc kubenswrapper[4834]: E1008 22:39:41.714209 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 08 22:39:41 crc kubenswrapper[4834]: E1008 22:39:41.714265 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert podName:00f8a5ab-e561-4b67-a56e-791342c7dbb4 nodeName:}" failed. 
No retries permitted until 2025-10-08 22:39:42.214249287 +0000 UTC m=+990.037134033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert") pod "openstack-operator-controller-manager-7d6957655c-ggfj6" (UID: "00f8a5ab-e561-4b67-a56e-791342c7dbb4") : secret "webhook-server-cert" not found Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.727593 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.758927 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.772123 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8"] Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.785106 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bnnl\" (UniqueName: \"kubernetes.io/projected/00f8a5ab-e561-4b67-a56e-791342c7dbb4-kube-api-access-6bnnl\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.816257 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt5sc\" (UniqueName: \"kubernetes.io/projected/0f36bacd-60f1-41f9-a0e9-1429cecade32-kube-api-access-wt5sc\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw\" (UID: \"0f36bacd-60f1-41f9-a0e9-1429cecade32\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.816351 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96bb0f12-b144-4f24-9b48-407519d51c6e-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.832043 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96bb0f12-b144-4f24-9b48-407519d51c6e-cert\") pod \"openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm\" (UID: \"96bb0f12-b144-4f24-9b48-407519d51c6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.838255 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt5sc\" (UniqueName: \"kubernetes.io/projected/0f36bacd-60f1-41f9-a0e9-1429cecade32-kube-api-access-wt5sc\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw\" (UID: \"0f36bacd-60f1-41f9-a0e9-1429cecade32\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" Oct 08 22:39:41 crc kubenswrapper[4834]: I1008 22:39:41.911485 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.014853 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.193837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" event={"ID":"6dbd0034-b992-4a64-ab92-268abe380d03","Type":"ContainerStarted","Data":"9366c68e747ed4281df1af7fc60a3f7aff5e8e6cb92e93880ac67bc62bc76469"} Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.195346 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" event={"ID":"864591d4-af96-44e6-8a1f-a01bf0b9fb44","Type":"ContainerStarted","Data":"d76cd7fffffa621ae895e776acbf7851f1758aade3c5626cc8758d1ec5e326a1"} Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.196436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" event={"ID":"4f3521e9-05df-408a-a765-7a7ba0046afa","Type":"ContainerStarted","Data":"bc896c3e8e3884f7afcc98992623d0035e6b91f240967d87e63f86e2a7c84ebf"} Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.199607 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" event={"ID":"775cc910-4ea9-4da1-b35a-a31b4c880010","Type":"ContainerStarted","Data":"144fd8c741de7f7655ec120c5d35de4acb78b6b76b518fcdb53578d421333318"} Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.224362 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 
22:39:42.224552 4834 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.224603 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert podName:00f8a5ab-e561-4b67-a56e-791342c7dbb4 nodeName:}" failed. No retries permitted until 2025-10-08 22:39:43.224590397 +0000 UTC m=+991.047475143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert") pod "openstack-operator-controller-manager-7d6957655c-ggfj6" (UID: "00f8a5ab-e561-4b67-a56e-791342c7dbb4") : secret "webhook-server-cert" not found Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.326781 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.353509 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r"] Oct 08 22:39:42 crc kubenswrapper[4834]: W1008 22:39:42.354505 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7ce4de_5f9c_47c3_a01b_2e23e1e27a82.slice/crio-26e121caf5904162398bd02332a5a0cd3cf9afe3b51b6ecefd58fefefae96ff2 WatchSource:0}: Error finding container 26e121caf5904162398bd02332a5a0cd3cf9afe3b51b6ecefd58fefefae96ff2: Status 404 returned error can't find the container with id 26e121caf5904162398bd02332a5a0cd3cf9afe3b51b6ecefd58fefefae96ff2 Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.427455 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert\") pod 
\"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.441134 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/baa45e46-72a6-4f2f-af9e-ce679038b8f1-cert\") pod \"infra-operator-controller-manager-656bcbd775-mxst7\" (UID: \"baa45e46-72a6-4f2f-af9e-ce679038b8f1\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.500088 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.506535 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.522701 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.529378 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.553880 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.621835 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.631409 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt"] Oct 08 22:39:42 crc kubenswrapper[4834]: W1008 22:39:42.635660 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cdb778f_1017_4f86_9443_85d7bf158bd6.slice/crio-72aae0afc2b61d72ceffea065cf90a8dc724abbe77c1fa7704080ec27b715121 WatchSource:0}: Error finding container 72aae0afc2b61d72ceffea065cf90a8dc724abbe77c1fa7704080ec27b715121: Status 404 returned error can't find the container with id 72aae0afc2b61d72ceffea065cf90a8dc724abbe77c1fa7704080ec27b715121 Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.638448 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9"] Oct 08 22:39:42 crc kubenswrapper[4834]: W1008 22:39:42.641574 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ffe8d9e_bb57_4156_8eff_fdb070a67e6d.slice/crio-6654daf6596b249ffbdac469f9ad15a0c4d25ef071224f4812b78e1decb8ad5f WatchSource:0}: Error finding container 6654daf6596b249ffbdac469f9ad15a0c4d25ef071224f4812b78e1decb8ad5f: Status 404 returned error can't find the container with id 6654daf6596b249ffbdac469f9ad15a0c4d25ef071224f4812b78e1decb8ad5f Oct 08 22:39:42 crc kubenswrapper[4834]: W1008 22:39:42.648408 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3286b799_dca1_495d_b329_9c46d97e024b.slice/crio-e8aa515241e60cd0f110e90e40418727190e0f271799025c77f1aec58394616c WatchSource:0}: Error finding container e8aa515241e60cd0f110e90e40418727190e0f271799025c77f1aec58394616c: Status 404 returned error can't find the container with id e8aa515241e60cd0f110e90e40418727190e0f271799025c77f1aec58394616c Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.649241 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b"] Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.658475 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tl4mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5df598886f-rw77z_openstack-operators(6a282f5d-5a52-4a80-bf82-6eea47007564): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.660228 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-rw77z"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.741948 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-phcvn"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.746246 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.750325 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74"] Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.763670 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wt5sc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw_openstack-operators(0f36bacd-60f1-41f9-a0e9-1429cecade32): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.763754 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mnxzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5458f77c4-phcvn_openstack-operators(8e0f70ea-9d3e-4ada-834a-1134f8485204): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.764042 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dbvrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
swift-operator-controller-manager-db6d7f97b-vxg74_openstack-operators(95926917-39a4-4757-b246-874a680f97ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.764880 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" podUID="0f36bacd-60f1-41f9-a0e9-1429cecade32" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.813251 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.867534 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk"] Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.884877 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8"] Oct 08 22:39:42 crc kubenswrapper[4834]: W1008 22:39:42.885351 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8aea845_c4c0_469f_ac39_9f4525e69ec5.slice/crio-cfcd9bb3826ac0a2a1d62426aec6d3982ca2ec10fb521b03904096dc75df8518 WatchSource:0}: Error finding container cfcd9bb3826ac0a2a1d62426aec6d3982ca2ec10fb521b03904096dc75df8518: Status 404 returned error can't find the container with id cfcd9bb3826ac0a2a1d62426aec6d3982ca2ec10fb521b03904096dc75df8518 Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.888203 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86ns4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-69fdcfc5f5-zqxbk_openstack-operators(f8aea845-c4c0-469f-ac39-9f4525e69ec5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.914840 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6lpqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-67cfc6749b-7lzg8_openstack-operators(6c2fa94f-b7fc-496b-a2f4-81695f1d86b2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 08 22:39:42 crc kubenswrapper[4834]: I1008 22:39:42.960677 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm"] Oct 08 22:39:42 crc kubenswrapper[4834]: E1008 22:39:42.962796 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" podUID="6a282f5d-5a52-4a80-bf82-6eea47007564" Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.091461 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" podUID="8e0f70ea-9d3e-4ada-834a-1134f8485204" Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.096219 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" podUID="95926917-39a4-4757-b246-874a680f97ce" Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.197408 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" podUID="f8aea845-c4c0-469f-ac39-9f4525e69ec5" Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.200498 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" podUID="6c2fa94f-b7fc-496b-a2f4-81695f1d86b2" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.215570 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" event={"ID":"6c2fa94f-b7fc-496b-a2f4-81695f1d86b2","Type":"ContainerStarted","Data":"7c2062804d5f59aeeb0b5a252e8eea4cf2f5a535a2476e56c3ad9cd8db4dd322"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.215640 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" event={"ID":"6c2fa94f-b7fc-496b-a2f4-81695f1d86b2","Type":"ContainerStarted","Data":"58d9e718d3e10e951910d3e85741ef507f9fed92376c5c9a77227c903ff3984e"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.218812 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" event={"ID":"0cdb778f-1017-4f86-9443-85d7bf158bd6","Type":"ContainerStarted","Data":"72aae0afc2b61d72ceffea065cf90a8dc724abbe77c1fa7704080ec27b715121"} Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.226005 4834 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" podUID="6c2fa94f-b7fc-496b-a2f4-81695f1d86b2" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.248737 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" event={"ID":"306dc2e6-3f9b-45c6-b615-75a6d2098857","Type":"ContainerStarted","Data":"6a8515bd5b951c0f8890711f5136c42d99c019a3910d0fc8a75a2cfda37a3305"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.251694 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" event={"ID":"8e0f70ea-9d3e-4ada-834a-1134f8485204","Type":"ContainerStarted","Data":"7b5696de9858bbc0219b4fff7d599d3d11efb0acdd20b4674b72b5b5f9746da4"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.251742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" event={"ID":"8e0f70ea-9d3e-4ada-834a-1134f8485204","Type":"ContainerStarted","Data":"d10c9a75e27c7d662b827e77b54e8649511062f953224380371b62181c5424cf"} Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.253868 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" podUID="8e0f70ea-9d3e-4ada-834a-1134f8485204" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.254561 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" event={"ID":"3ffe8d9e-bb57-4156-8eff-fdb070a67e6d","Type":"ContainerStarted","Data":"6654daf6596b249ffbdac469f9ad15a0c4d25ef071224f4812b78e1decb8ad5f"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.258538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" event={"ID":"b886f9b3-c296-4445-9a60-cb6809463741","Type":"ContainerStarted","Data":"14722b3b5db60dc5091b49c025adfd00eab679c7f85dbb677d3f3e3ade6a5e90"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.273274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert\") pod \"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.279464 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" event={"ID":"6a282f5d-5a52-4a80-bf82-6eea47007564","Type":"ContainerStarted","Data":"3c7bbe17ce72afd40c598f4caf4104dd0f9acacbfabcb72c417eac8684215955"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.279521 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" event={"ID":"6a282f5d-5a52-4a80-bf82-6eea47007564","Type":"ContainerStarted","Data":"20be158472869d47ba39a0a9ba270956dd28d92f0089ac6d65471243f7689dbb"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.283042 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00f8a5ab-e561-4b67-a56e-791342c7dbb4-cert\") pod 
\"openstack-operator-controller-manager-7d6957655c-ggfj6\" (UID: \"00f8a5ab-e561-4b67-a56e-791342c7dbb4\") " pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.284130 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" event={"ID":"0f36bacd-60f1-41f9-a0e9-1429cecade32","Type":"ContainerStarted","Data":"bb1c93c461ff917ddcc2c5d5084a8c14b5712e89cded89d1dc860a79a885b357"} Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.284725 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" podUID="6a282f5d-5a52-4a80-bf82-6eea47007564" Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.297636 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" podUID="0f36bacd-60f1-41f9-a0e9-1429cecade32" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.303505 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" event={"ID":"3286b799-dca1-495d-b329-9c46d97e024b","Type":"ContainerStarted","Data":"e8aa515241e60cd0f110e90e40418727190e0f271799025c77f1aec58394616c"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.309991 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.328392 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" event={"ID":"95926917-39a4-4757-b246-874a680f97ce","Type":"ContainerStarted","Data":"638296463faaa7b9dc9f35e8968d688739df5a4aa448b79588d2f9c797c03539"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.328444 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" event={"ID":"95926917-39a4-4757-b246-874a680f97ce","Type":"ContainerStarted","Data":"2bb61e6f4e66b2f611efb05b6af1f8a025d9180bb6c75a4ee1fc14832c4c698a"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.333480 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" event={"ID":"f8aea845-c4c0-469f-ac39-9f4525e69ec5","Type":"ContainerStarted","Data":"4bef5536ecb83ef4c3a1cf5548212069cc59b70fe11de5ff344eedc965777270"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.333536 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" event={"ID":"f8aea845-c4c0-469f-ac39-9f4525e69ec5","Type":"ContainerStarted","Data":"cfcd9bb3826ac0a2a1d62426aec6d3982ca2ec10fb521b03904096dc75df8518"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.334584 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" event={"ID":"96bb0f12-b144-4f24-9b48-407519d51c6e","Type":"ContainerStarted","Data":"29d38cc5268228a632c0aebb2ef6e7bc73e23c8ba46dc0d7513af1b23c1333e7"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.335624 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" event={"ID":"5d7e03d9-baa9-4867-9fbc-91a82a36f4e2","Type":"ContainerStarted","Data":"d8446259aeb7b8b93c33238875d1a6d529ea13b1d6b3ac5e9eaa62b6f85a703c"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.336492 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" event={"ID":"cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82","Type":"ContainerStarted","Data":"26e121caf5904162398bd02332a5a0cd3cf9afe3b51b6ecefd58fefefae96ff2"} Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.337461 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" podUID="f8aea845-c4c0-469f-ac39-9f4525e69ec5" Oct 08 22:39:43 crc kubenswrapper[4834]: E1008 22:39:43.337690 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" podUID="95926917-39a4-4757-b246-874a680f97ce" Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.337830 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" event={"ID":"84427f76-6342-4e6b-9875-56b2d3db0fac","Type":"ContainerStarted","Data":"1cccd59042a5027a678a32429d408690b562b6e34a60eb66541d58f9a2506412"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.348411 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" event={"ID":"d062b1e7-96b9-48e0-acb3-528dc3c7e59c","Type":"ContainerStarted","Data":"2e34fe855c020e1dc567f12f4b0857b87ee9346b6aa216ebb5ff5fc8506189e1"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.362492 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" event={"ID":"5452435d-906d-4f08-87e9-168187eb4d5c","Type":"ContainerStarted","Data":"a78d1be4ae1716b9f29081c3be0b36af98d3ca9c5e97d47cd3f0fe5419a8e24e"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.364397 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" event={"ID":"baa45e46-72a6-4f2f-af9e-ce679038b8f1","Type":"ContainerStarted","Data":"6af7c767233443609daf4f31a56eea8758c466b58e3107bf859c7071d45dc1a4"} Oct 08 22:39:43 crc kubenswrapper[4834]: I1008 22:39:43.742553 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6"] Oct 08 22:39:44 crc kubenswrapper[4834]: I1008 22:39:44.377748 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" event={"ID":"00f8a5ab-e561-4b67-a56e-791342c7dbb4","Type":"ContainerStarted","Data":"3ebfbda4752cb9b3bea5194f0c204bb9a98d4fdb0d3183a5598ca228287d4c39"} Oct 08 22:39:44 crc kubenswrapper[4834]: I1008 22:39:44.378115 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" event={"ID":"00f8a5ab-e561-4b67-a56e-791342c7dbb4","Type":"ContainerStarted","Data":"200b58db58acd7fa6622b2935fe3170e617202a5660226a935894cb16de4dd13"} Oct 08 22:39:44 crc kubenswrapper[4834]: E1008 22:39:44.379673 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" podUID="6c2fa94f-b7fc-496b-a2f4-81695f1d86b2" Oct 08 22:39:44 crc kubenswrapper[4834]: E1008 22:39:44.380032 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" podUID="95926917-39a4-4757-b246-874a680f97ce" Oct 08 22:39:44 crc kubenswrapper[4834]: E1008 22:39:44.383972 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" podUID="0f36bacd-60f1-41f9-a0e9-1429cecade32" Oct 08 22:39:44 crc kubenswrapper[4834]: E1008 22:39:44.384047 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" podUID="8e0f70ea-9d3e-4ada-834a-1134f8485204" Oct 08 22:39:44 crc kubenswrapper[4834]: E1008 22:39:44.384085 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" podUID="f8aea845-c4c0-469f-ac39-9f4525e69ec5" Oct 08 22:39:44 crc kubenswrapper[4834]: E1008 22:39:44.384123 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" podUID="6a282f5d-5a52-4a80-bf82-6eea47007564" Oct 08 22:39:45 crc kubenswrapper[4834]: I1008 22:39:45.402394 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" event={"ID":"00f8a5ab-e561-4b67-a56e-791342c7dbb4","Type":"ContainerStarted","Data":"f08975605d8d3c63ffa769b784148ec8fa4294eecb04e0934920ac42182ae285"} Oct 08 22:39:45 crc kubenswrapper[4834]: I1008 22:39:45.402838 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:45 crc kubenswrapper[4834]: I1008 22:39:45.440794 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" podStartSLOduration=4.440775102 podStartE2EDuration="4.440775102s" podCreationTimestamp="2025-10-08 22:39:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:39:45.43865011 +0000 UTC m=+993.261534856" watchObservedRunningTime="2025-10-08 22:39:45.440775102 +0000 UTC m=+993.263659848" Oct 08 22:39:53 crc kubenswrapper[4834]: I1008 22:39:53.317671 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7d6957655c-ggfj6" Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.508305 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" event={"ID":"5452435d-906d-4f08-87e9-168187eb4d5c","Type":"ContainerStarted","Data":"8ae558f5af18d63d65d3b5e55d3b6c1ba30706af82173f75a6586069e30c85ec"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.521060 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" event={"ID":"4f3521e9-05df-408a-a765-7a7ba0046afa","Type":"ContainerStarted","Data":"ebc1110083fda30b67842ddb52e6969fe5c6b3f4aaa683fb7cf467cfa83f7371"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.540351 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" event={"ID":"0cdb778f-1017-4f86-9443-85d7bf158bd6","Type":"ContainerStarted","Data":"40590669aefcbf9b0d66a7bdb8f30ac1c78cb93ad42c997d882b9c37cb33103a"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.556345 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" event={"ID":"775cc910-4ea9-4da1-b35a-a31b4c880010","Type":"ContainerStarted","Data":"3c4d2f08fb00c7db4dda0549a397de6609c89903dffa3ebce1b65db28ff85429"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.584018 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" event={"ID":"b886f9b3-c296-4445-9a60-cb6809463741","Type":"ContainerStarted","Data":"9fbb0e8c0c2be4099e7be3dfbfbde5bc6f7d24aec167633be6adbc26ef3ab125"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.593409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" event={"ID":"cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82","Type":"ContainerStarted","Data":"95d35ad4a5f9b8f863cf7d33420f9a0963a3a739f23ee55423494919acfbe32b"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.641529 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" event={"ID":"3286b799-dca1-495d-b329-9c46d97e024b","Type":"ContainerStarted","Data":"452634a32da37d5c7d5efea7141336f6cab0e92c3f0e915ebc4ad6da60523aa3"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.685681 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" event={"ID":"96bb0f12-b144-4f24-9b48-407519d51c6e","Type":"ContainerStarted","Data":"2bc7a13321dbfe94a7016b8f8cb463e140b3322c230248ee235fda69c595ee5c"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.709163 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" event={"ID":"5d7e03d9-baa9-4867-9fbc-91a82a36f4e2","Type":"ContainerStarted","Data":"28f100b523fabbf11ded39970de4a7473185739fbfa5ff7c743b0758cfcf64b1"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.727190 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" event={"ID":"d062b1e7-96b9-48e0-acb3-528dc3c7e59c","Type":"ContainerStarted","Data":"a02d085e10cd9857142dbc094ab69508f7d0eb618770b0263910d4ca05aa64c7"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.731818 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" event={"ID":"6dbd0034-b992-4a64-ab92-268abe380d03","Type":"ContainerStarted","Data":"462c2a111966395f26722e899b248af5b34a9a5284f3232486cfca9592747016"} Oct 08 22:39:55 
crc kubenswrapper[4834]: I1008 22:39:55.744793 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" event={"ID":"baa45e46-72a6-4f2f-af9e-ce679038b8f1","Type":"ContainerStarted","Data":"becf2f761692b4a1fa47a72a19b0d3db12d3c5cc4244bacfbabf6c297817b25c"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.752629 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" event={"ID":"306dc2e6-3f9b-45c6-b615-75a6d2098857","Type":"ContainerStarted","Data":"bf41ccb280290f6cca1a2e49cc1235f8caf5b53a4b4936bf436b2019b136eac6"} Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.753542 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:39:55 crc kubenswrapper[4834]: I1008 22:39:55.762444 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" event={"ID":"3ffe8d9e-bb57-4156-8eff-fdb070a67e6d","Type":"ContainerStarted","Data":"cd8cb9ab93e02873b2dbcfce56d9f25018f776233b33a0f32ff0f3a02dd2cb54"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.776055 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" event={"ID":"5d7e03d9-baa9-4867-9fbc-91a82a36f4e2","Type":"ContainerStarted","Data":"71af5da7e3d9c966ee5ff1fa71d2c1a3aeecf6164883746656c259b13e02743e"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.776542 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.780870 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" event={"ID":"0cdb778f-1017-4f86-9443-85d7bf158bd6","Type":"ContainerStarted","Data":"d9c83dad2bb66c8ad9ff0149d3010533540bdf5fd8e89734e544c343427f60da"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.781093 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.786074 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" event={"ID":"3ffe8d9e-bb57-4156-8eff-fdb070a67e6d","Type":"ContainerStarted","Data":"4fc3fcf01fe2faa7dcc4f28d6653319b8e248a35af806b45535a9b92ae827f04"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.786805 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.792680 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" event={"ID":"b886f9b3-c296-4445-9a60-cb6809463741","Type":"ContainerStarted","Data":"822437fe9b85f3d731a7a122455ec625ba8385da6d3a99d013c805ca0810f777"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.792837 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.797429 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" podStartSLOduration=4.711611295 podStartE2EDuration="16.797416111s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.523708322 +0000 UTC m=+990.346593058" 
lastFinishedPulling="2025-10-08 22:39:54.609513128 +0000 UTC m=+1002.432397874" observedRunningTime="2025-10-08 22:39:56.796832446 +0000 UTC m=+1004.619717192" watchObservedRunningTime="2025-10-08 22:39:56.797416111 +0000 UTC m=+1004.620300857" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.801852 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" podStartSLOduration=4.581023266 podStartE2EDuration="16.801841968s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.333747564 +0000 UTC m=+990.156632300" lastFinishedPulling="2025-10-08 22:39:54.554566256 +0000 UTC m=+1002.377451002" observedRunningTime="2025-10-08 22:39:55.779435918 +0000 UTC m=+1003.602320654" watchObservedRunningTime="2025-10-08 22:39:56.801841968 +0000 UTC m=+1004.624726714" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.805302 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" event={"ID":"306dc2e6-3f9b-45c6-b615-75a6d2098857","Type":"ContainerStarted","Data":"5be6d4e69f1669ee7bdd26193c47629eea6f8f5483c64cb49161d4acc856f52f"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.813608 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" event={"ID":"d062b1e7-96b9-48e0-acb3-528dc3c7e59c","Type":"ContainerStarted","Data":"d29efc065870f7754aab5c2adfc292094556c0ec6b7c93a1c63fc5868a173b33"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.813794 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.817570 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" podStartSLOduration=4.838255896 podStartE2EDuration="16.817547329s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.638101967 +0000 UTC m=+990.460986713" lastFinishedPulling="2025-10-08 22:39:54.6173934 +0000 UTC m=+1002.440278146" observedRunningTime="2025-10-08 22:39:56.815553031 +0000 UTC m=+1004.638437777" watchObservedRunningTime="2025-10-08 22:39:56.817547329 +0000 UTC m=+1004.640432075" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.820738 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" event={"ID":"3286b799-dca1-495d-b329-9c46d97e024b","Type":"ContainerStarted","Data":"2b93185fa13b6163fe992a20bff17ab6991b0e9a06422a446cf2019a4b59dbcf"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.821187 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.824233 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" event={"ID":"864591d4-af96-44e6-8a1f-a01bf0b9fb44","Type":"ContainerStarted","Data":"0bb56b200f65259d664c9ae371c48754e46b4f1a1b70437832940aac6aa6b5f5"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.824306 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" event={"ID":"864591d4-af96-44e6-8a1f-a01bf0b9fb44","Type":"ContainerStarted","Data":"c3d434a8eaa8731347dec08a8c7aca07492cec86c6ad0af410517b61bd62e2dc"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.824404 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 
08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.828069 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" event={"ID":"baa45e46-72a6-4f2f-af9e-ce679038b8f1","Type":"ContainerStarted","Data":"efeed2a0ea39c628d719cd81acf3cde9375522537d7c5cd4c490d698968068b1"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.828211 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.829663 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" event={"ID":"6dbd0034-b992-4a64-ab92-268abe380d03","Type":"ContainerStarted","Data":"5abaa36f277db7e7902eb50d88fbd4188f041894adab927ff0a541dcb0647f47"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.829794 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.832461 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" event={"ID":"96bb0f12-b144-4f24-9b48-407519d51c6e","Type":"ContainerStarted","Data":"37b11663e1082d64bf45d0bae4989c26ff3e33b8643ecd5a0869e8eae1c4d65e"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.832581 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.839811 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" 
event={"ID":"4f3521e9-05df-408a-a765-7a7ba0046afa","Type":"ContainerStarted","Data":"083a39276722c943a45e85f844889195c9d065800fd3e15b8c494fadd28a8860"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.840896 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.841297 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" podStartSLOduration=4.680823817 podStartE2EDuration="16.841278484s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.524888231 +0000 UTC m=+990.347772987" lastFinishedPulling="2025-10-08 22:39:54.685342908 +0000 UTC m=+1002.508227654" observedRunningTime="2025-10-08 22:39:56.83653353 +0000 UTC m=+1004.659418276" watchObservedRunningTime="2025-10-08 22:39:56.841278484 +0000 UTC m=+1004.664163220" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.848327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" event={"ID":"cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82","Type":"ContainerStarted","Data":"444e4c9ad7079b099885606cff8ad9b33379d5ed30425ae075502c30811c898e"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.848504 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.850962 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" event={"ID":"775cc910-4ea9-4da1-b35a-a31b4c880010","Type":"ContainerStarted","Data":"3b95e6230834da70d960bfe80628a34ac9d25d214a6608a8e6c279ef427ea2a7"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.851107 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.858232 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" podStartSLOduration=4.885892892 podStartE2EDuration="16.858204465s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.64521918 +0000 UTC m=+990.468103926" lastFinishedPulling="2025-10-08 22:39:54.617530733 +0000 UTC m=+1002.440415499" observedRunningTime="2025-10-08 22:39:56.855132131 +0000 UTC m=+1004.678016877" watchObservedRunningTime="2025-10-08 22:39:56.858204465 +0000 UTC m=+1004.681089211" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.865336 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" event={"ID":"84427f76-6342-4e6b-9875-56b2d3db0fac","Type":"ContainerStarted","Data":"6d92220208b8e10880442e7fdbc18fc7d6c0ecc978543c889348e4198413476a"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.865396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" event={"ID":"84427f76-6342-4e6b-9875-56b2d3db0fac","Type":"ContainerStarted","Data":"84784d999106e5cfa53b8906a0a5ea85dc109c6847bc519d20b7ac37c7dee8e4"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.866331 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.868437 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" 
event={"ID":"5452435d-906d-4f08-87e9-168187eb4d5c","Type":"ContainerStarted","Data":"bec08efd03f8bcc3f230100462c7791a7e01a408bb1149fb736a621faf606c08"} Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.868932 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.894187 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" podStartSLOduration=5.275284608 podStartE2EDuration="16.894165868s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.990198098 +0000 UTC m=+990.813082844" lastFinishedPulling="2025-10-08 22:39:54.609079338 +0000 UTC m=+1002.431964104" observedRunningTime="2025-10-08 22:39:56.889569435 +0000 UTC m=+1004.712454181" watchObservedRunningTime="2025-10-08 22:39:56.894165868 +0000 UTC m=+1004.717050604" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.916328 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" podStartSLOduration=4.662672548 podStartE2EDuration="16.916307965s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.355756698 +0000 UTC m=+990.178641444" lastFinishedPulling="2025-10-08 22:39:54.609392115 +0000 UTC m=+1002.432276861" observedRunningTime="2025-10-08 22:39:56.910908034 +0000 UTC m=+1004.733792770" watchObservedRunningTime="2025-10-08 22:39:56.916307965 +0000 UTC m=+1004.739192711" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.927373 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" podStartSLOduration=3.951979955 podStartE2EDuration="15.927347822s" 
podCreationTimestamp="2025-10-08 22:39:41 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.651049461 +0000 UTC m=+990.473934207" lastFinishedPulling="2025-10-08 22:39:54.626417298 +0000 UTC m=+1002.449302074" observedRunningTime="2025-10-08 22:39:56.92560944 +0000 UTC m=+1004.748494186" watchObservedRunningTime="2025-10-08 22:39:56.927347822 +0000 UTC m=+1004.750232568" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.951327 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" podStartSLOduration=4.339663002 podStartE2EDuration="16.951296624s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:41.997421076 +0000 UTC m=+989.820305822" lastFinishedPulling="2025-10-08 22:39:54.609054668 +0000 UTC m=+1002.431939444" observedRunningTime="2025-10-08 22:39:56.948719591 +0000 UTC m=+1004.771604337" watchObservedRunningTime="2025-10-08 22:39:56.951296624 +0000 UTC m=+1004.774181370" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.970512 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" podStartSLOduration=4.387586944 podStartE2EDuration="16.970471168s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:41.983612981 +0000 UTC m=+989.806497727" lastFinishedPulling="2025-10-08 22:39:54.566497175 +0000 UTC m=+1002.389381951" observedRunningTime="2025-10-08 22:39:56.966867161 +0000 UTC m=+1004.789751907" watchObservedRunningTime="2025-10-08 22:39:56.970471168 +0000 UTC m=+1004.793355914" Oct 08 22:39:56 crc kubenswrapper[4834]: I1008 22:39:56.992172 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" podStartSLOduration=5.012748759 podStartE2EDuration="16.992153194s" 
podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.639418619 +0000 UTC m=+990.462303365" lastFinishedPulling="2025-10-08 22:39:54.618823024 +0000 UTC m=+1002.441707800" observedRunningTime="2025-10-08 22:39:56.988995317 +0000 UTC m=+1004.811880063" watchObservedRunningTime="2025-10-08 22:39:56.992153194 +0000 UTC m=+1004.815037940" Oct 08 22:39:57 crc kubenswrapper[4834]: I1008 22:39:57.017816 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" podStartSLOduration=4.013245474 podStartE2EDuration="17.017795876s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:41.622058121 +0000 UTC m=+989.444942867" lastFinishedPulling="2025-10-08 22:39:54.626608523 +0000 UTC m=+1002.449493269" observedRunningTime="2025-10-08 22:39:57.005738353 +0000 UTC m=+1004.828623099" watchObservedRunningTime="2025-10-08 22:39:57.017795876 +0000 UTC m=+1004.840680622" Oct 08 22:39:57 crc kubenswrapper[4834]: I1008 22:39:57.021655 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" podStartSLOduration=5.279584121 podStartE2EDuration="17.021628669s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.86702137 +0000 UTC m=+990.689906116" lastFinishedPulling="2025-10-08 22:39:54.609065878 +0000 UTC m=+1002.431950664" observedRunningTime="2025-10-08 22:39:57.018711699 +0000 UTC m=+1004.841596445" watchObservedRunningTime="2025-10-08 22:39:57.021628669 +0000 UTC m=+1004.844513415" Oct 08 22:39:57 crc kubenswrapper[4834]: I1008 22:39:57.041900 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" podStartSLOduration=4.322614989 podStartE2EDuration="17.04187612s" 
podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:41.966115307 +0000 UTC m=+989.789000053" lastFinishedPulling="2025-10-08 22:39:54.685376418 +0000 UTC m=+1002.508261184" observedRunningTime="2025-10-08 22:39:57.035648799 +0000 UTC m=+1004.858533565" watchObservedRunningTime="2025-10-08 22:39:57.04187612 +0000 UTC m=+1004.864760866" Oct 08 22:39:57 crc kubenswrapper[4834]: I1008 22:39:57.086944 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" podStartSLOduration=4.913594284 podStartE2EDuration="17.086921303s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.523016716 +0000 UTC m=+990.345901472" lastFinishedPulling="2025-10-08 22:39:54.696343745 +0000 UTC m=+1002.519228491" observedRunningTime="2025-10-08 22:39:57.063514985 +0000 UTC m=+1004.886399751" watchObservedRunningTime="2025-10-08 22:39:57.086921303 +0000 UTC m=+1004.909806049" Oct 08 22:39:57 crc kubenswrapper[4834]: I1008 22:39:57.088935 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" podStartSLOduration=4.992694273 podStartE2EDuration="17.088913201s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.529085663 +0000 UTC m=+990.351970409" lastFinishedPulling="2025-10-08 22:39:54.625304571 +0000 UTC m=+1002.448189337" observedRunningTime="2025-10-08 22:39:57.083986581 +0000 UTC m=+1004.906871327" watchObservedRunningTime="2025-10-08 22:39:57.088913201 +0000 UTC m=+1004.911797937" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.890903 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-dvbzh" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.904857 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" event={"ID":"6a282f5d-5a52-4a80-bf82-6eea47007564","Type":"ContainerStarted","Data":"2fbe65dc56da7fc2dd6f067ba02a78ce2b9852ce725ceb8549261c45d6849868"} Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.904972 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-8fbc8" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.922494 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" event={"ID":"0f36bacd-60f1-41f9-a0e9-1429cecade32","Type":"ContainerStarted","Data":"628931a48d5599c70011e2bf1d97653c5af11d8c17e70f2ecd8067855afa3408"} Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.924713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" event={"ID":"95926917-39a4-4757-b246-874a680f97ce","Type":"ContainerStarted","Data":"d328c00b2d704539f8d193494c64810e26298c6a8a6c2b5f5452792a5686829f"} Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.925349 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.935363 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" event={"ID":"f8aea845-c4c0-469f-ac39-9f4525e69ec5","Type":"ContainerStarted","Data":"f2bc83e76edb543621307fa380903d4434e24bf238565066fb692108eaba84e1"} Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.936298 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.944181 
4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-8fhdd" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.945130 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-nf4vz" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.961463 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw" podStartSLOduration=3.035672288 podStartE2EDuration="19.961431677s" podCreationTimestamp="2025-10-08 22:39:41 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.763511049 +0000 UTC m=+990.586395795" lastFinishedPulling="2025-10-08 22:39:59.689270408 +0000 UTC m=+1007.512155184" observedRunningTime="2025-10-08 22:40:00.951120806 +0000 UTC m=+1008.774005552" watchObservedRunningTime="2025-10-08 22:40:00.961431677 +0000 UTC m=+1008.784316443" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.977360 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" podStartSLOduration=3.937249329 podStartE2EDuration="20.977345092s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.658322627 +0000 UTC m=+990.481207373" lastFinishedPulling="2025-10-08 22:39:59.69841836 +0000 UTC m=+1007.521303136" observedRunningTime="2025-10-08 22:40:00.969642515 +0000 UTC m=+1008.792527271" watchObservedRunningTime="2025-10-08 22:40:00.977345092 +0000 UTC m=+1008.800229838" Oct 08 22:40:00 crc kubenswrapper[4834]: I1008 22:40:00.990282 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74" podStartSLOduration=4.041853288 podStartE2EDuration="20.990263015s" podCreationTimestamp="2025-10-08 
22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.763854408 +0000 UTC m=+990.586739154" lastFinishedPulling="2025-10-08 22:39:59.712264105 +0000 UTC m=+1007.535148881" observedRunningTime="2025-10-08 22:40:00.983772719 +0000 UTC m=+1008.806657475" watchObservedRunningTime="2025-10-08 22:40:00.990263015 +0000 UTC m=+1008.813147781" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.026100 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-xprwf" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.037259 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-lwsbk" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.074480 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-ssskr" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.082763 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk" podStartSLOduration=4.283170651 podStartE2EDuration="21.082737859s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.88806944 +0000 UTC m=+990.710954186" lastFinishedPulling="2025-10-08 22:39:59.687636638 +0000 UTC m=+1007.510521394" observedRunningTime="2025-10-08 22:40:01.075930563 +0000 UTC m=+1008.898815309" watchObservedRunningTime="2025-10-08 22:40:01.082737859 +0000 UTC m=+1008.905622605" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.195904 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-z5gzx" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.230671 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-q2qld" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.348141 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.350394 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-7mh2r" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.379607 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-jkmf4" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.454751 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79db49b9fb-q9bmt" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.472957 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-6r5r9" Oct 08 22:40:01 crc kubenswrapper[4834]: I1008 22:40:01.698560 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-dkc8b" Oct 08 22:40:02 crc kubenswrapper[4834]: I1008 22:40:02.022103 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm" Oct 08 22:40:02 crc kubenswrapper[4834]: I1008 22:40:02.575429 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-mxst7" Oct 08 22:40:11 crc kubenswrapper[4834]: I1008 22:40:11.357326 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/nova-operator-controller-manager-5df598886f-rw77z"
Oct 08 22:40:11 crc kubenswrapper[4834]: I1008 22:40:11.502597 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-vxg74"
Oct 08 22:40:11 crc kubenswrapper[4834]: I1008 22:40:11.655785 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-zqxbk"
Oct 08 22:40:12 crc kubenswrapper[4834]: I1008 22:40:12.024399 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" event={"ID":"6c2fa94f-b7fc-496b-a2f4-81695f1d86b2","Type":"ContainerStarted","Data":"f2e1ec185587b3090533dea00750f167ee99fe63bdb34133e052fa6f3c61c1d4"}
Oct 08 22:40:12 crc kubenswrapper[4834]: I1008 22:40:12.024985 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8"
Oct 08 22:40:12 crc kubenswrapper[4834]: I1008 22:40:12.028178 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" event={"ID":"8e0f70ea-9d3e-4ada-834a-1134f8485204","Type":"ContainerStarted","Data":"fe7dc0918c631e2102f1f41540775d5478f1b54d42f8a97f1f31666b910ac269"}
Oct 08 22:40:12 crc kubenswrapper[4834]: I1008 22:40:12.028483 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn"
Oct 08 22:40:12 crc kubenswrapper[4834]: I1008 22:40:12.046649 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8" podStartSLOduration=3.428739325 podStartE2EDuration="32.046619071s" podCreationTimestamp="2025-10-08 22:39:40 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.914649045 +0000 UTC m=+990.737533791" lastFinishedPulling="2025-10-08 22:40:11.532528771 +0000 UTC m=+1019.355413537" observedRunningTime="2025-10-08 22:40:12.041577068 +0000 UTC m=+1019.864461824" watchObservedRunningTime="2025-10-08 22:40:12.046619071 +0000 UTC m=+1019.869503847"
Oct 08 22:40:12 crc kubenswrapper[4834]: I1008 22:40:12.062496 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn" podStartSLOduration=2.293575676 podStartE2EDuration="31.062474845s" podCreationTimestamp="2025-10-08 22:39:41 +0000 UTC" firstStartedPulling="2025-10-08 22:39:42.763627322 +0000 UTC m=+990.586512068" lastFinishedPulling="2025-10-08 22:40:11.532526481 +0000 UTC m=+1019.355411237" observedRunningTime="2025-10-08 22:40:12.057261559 +0000 UTC m=+1019.880146315" watchObservedRunningTime="2025-10-08 22:40:12.062474845 +0000 UTC m=+1019.885359581"
Oct 08 22:40:21 crc kubenswrapper[4834]: I1008 22:40:21.518942 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-7lzg8"
Oct 08 22:40:21 crc kubenswrapper[4834]: I1008 22:40:21.554018 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5458f77c4-phcvn"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.184319 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-tx5lv"]
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.189736 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.194482 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.194635 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.194656 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-trwfp"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.194741 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.211707 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-tx5lv"]
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.261199 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-l479j"]
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.262650 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.265178 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.282054 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-l479j"]
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.327114 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e53c46-ad91-4dd7-8288-d4f9c362a180-config\") pod \"dnsmasq-dns-7bfcb9d745-tx5lv\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.327323 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxhdv\" (UniqueName: \"kubernetes.io/projected/11e53c46-ad91-4dd7-8288-d4f9c362a180-kube-api-access-dxhdv\") pod \"dnsmasq-dns-7bfcb9d745-tx5lv\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.429539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/73eef7aa-916e-4c41-b86a-27cf30ef8633-kube-api-access-bn5jk\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.429643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-dns-svc\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.429687 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-config\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.429743 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e53c46-ad91-4dd7-8288-d4f9c362a180-config\") pod \"dnsmasq-dns-7bfcb9d745-tx5lv\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.429821 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxhdv\" (UniqueName: \"kubernetes.io/projected/11e53c46-ad91-4dd7-8288-d4f9c362a180-kube-api-access-dxhdv\") pod \"dnsmasq-dns-7bfcb9d745-tx5lv\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.431828 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e53c46-ad91-4dd7-8288-d4f9c362a180-config\") pod \"dnsmasq-dns-7bfcb9d745-tx5lv\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.453674 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxhdv\" (UniqueName: \"kubernetes.io/projected/11e53c46-ad91-4dd7-8288-d4f9c362a180-kube-api-access-dxhdv\") pod \"dnsmasq-dns-7bfcb9d745-tx5lv\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.514972 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.531047 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/73eef7aa-916e-4c41-b86a-27cf30ef8633-kube-api-access-bn5jk\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.531112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-dns-svc\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.531141 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-config\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.532376 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-config\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.533461 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-dns-svc\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.573328 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/73eef7aa-916e-4c41-b86a-27cf30ef8633-kube-api-access-bn5jk\") pod \"dnsmasq-dns-758b79db4c-l479j\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:36 crc kubenswrapper[4834]: I1008 22:40:36.589087 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-l479j"
Oct 08 22:40:37 crc kubenswrapper[4834]: I1008 22:40:37.021245 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-tx5lv"]
Oct 08 22:40:37 crc kubenswrapper[4834]: I1008 22:40:37.093129 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-l479j"]
Oct 08 22:40:37 crc kubenswrapper[4834]: W1008 22:40:37.096867 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73eef7aa_916e_4c41_b86a_27cf30ef8633.slice/crio-6505c828d1053258e390ae6fa196f1eddc338f6eac4c31808f7f7a163b0162e4 WatchSource:0}: Error finding container 6505c828d1053258e390ae6fa196f1eddc338f6eac4c31808f7f7a163b0162e4: Status 404 returned error can't find the container with id 6505c828d1053258e390ae6fa196f1eddc338f6eac4c31808f7f7a163b0162e4
Oct 08 22:40:37 crc kubenswrapper[4834]: I1008 22:40:37.249915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv" event={"ID":"11e53c46-ad91-4dd7-8288-d4f9c362a180","Type":"ContainerStarted","Data":"0b149bf1e54e62d3eb8888cb5f57af3b547f47c2aabd1775dad88d83b364f7e7"}
Oct 08 22:40:37 crc kubenswrapper[4834]: I1008 22:40:37.251868 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-l479j" event={"ID":"73eef7aa-916e-4c41-b86a-27cf30ef8633","Type":"ContainerStarted","Data":"6505c828d1053258e390ae6fa196f1eddc338f6eac4c31808f7f7a163b0162e4"}
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.763717 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-l479j"]
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.778654 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644597f84c-46l7c"]
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.781574 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.794965 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-46l7c"]
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.865610 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-dns-svc\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.865728 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74d6\" (UniqueName: \"kubernetes.io/projected/44e528c2-5eb9-465e-8df9-012865d20ced-kube-api-access-n74d6\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.865838 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-config\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.967973 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-dns-svc\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.968034 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74d6\" (UniqueName: \"kubernetes.io/projected/44e528c2-5eb9-465e-8df9-012865d20ced-kube-api-access-n74d6\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.968092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-config\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.968971 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-config\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:38 crc kubenswrapper[4834]: I1008 22:40:38.969092 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-dns-svc\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.026828 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74d6\" (UniqueName: \"kubernetes.io/projected/44e528c2-5eb9-465e-8df9-012865d20ced-kube-api-access-n74d6\") pod \"dnsmasq-dns-644597f84c-46l7c\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.055518 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-tx5lv"]
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.091631 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77597f887-jwkgj"]
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.093530 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.102216 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jwkgj"]
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.102587 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-46l7c"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.175868 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-dns-svc\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.176836 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zvv9\" (UniqueName: \"kubernetes.io/projected/27ef60d6-9b76-4b61-9c92-75fa394546a0-kube-api-access-9zvv9\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.176939 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-config\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.279443 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zvv9\" (UniqueName: \"kubernetes.io/projected/27ef60d6-9b76-4b61-9c92-75fa394546a0-kube-api-access-9zvv9\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.279540 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-config\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.279623 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-dns-svc\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.280844 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-dns-svc\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.283983 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-config\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.303806 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zvv9\" (UniqueName: \"kubernetes.io/projected/27ef60d6-9b76-4b61-9c92-75fa394546a0-kube-api-access-9zvv9\") pod \"dnsmasq-dns-77597f887-jwkgj\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.412952 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jwkgj"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.618566 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-46l7c"]
Oct 08 22:40:39 crc kubenswrapper[4834]: W1008 22:40:39.643546 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44e528c2_5eb9_465e_8df9_012865d20ced.slice/crio-e5da390df93eb02730afa36baf815a888c6844ae5c5dec8bf3ec0f6d3e1a7e86 WatchSource:0}: Error finding container e5da390df93eb02730afa36baf815a888c6844ae5c5dec8bf3ec0f6d3e1a7e86: Status 404 returned error can't find the container with id e5da390df93eb02730afa36baf815a888c6844ae5c5dec8bf3ec0f6d3e1a7e86
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.651954 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.916607 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jwkgj"]
Oct 08 22:40:39 crc kubenswrapper[4834]: W1008 22:40:39.923928 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27ef60d6_9b76_4b61_9c92_75fa394546a0.slice/crio-d29cabfdcec3c05e6ad3f1bd1c705f5a11e2fd028f36fd84f0de66d0a2b2cdd7 WatchSource:0}: Error finding container d29cabfdcec3c05e6ad3f1bd1c705f5a11e2fd028f36fd84f0de66d0a2b2cdd7: Status 404 returned error can't find the container with id d29cabfdcec3c05e6ad3f1bd1c705f5a11e2fd028f36fd84f0de66d0a2b2cdd7
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.934710 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.935943 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.939156 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.939314 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.939435 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.939612 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.939741 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.939867 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-h2mrc"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.940739 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 08 22:40:39 crc kubenswrapper[4834]: I1008 22:40:39.961633 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095069 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095121 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095157 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095185 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08a7721f-38a1-4a82-88ed-6f70290b5a6d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095313 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvbvj\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-kube-api-access-fvbvj\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095496 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095536 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095561 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095587 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08a7721f-38a1-4a82-88ed-6f70290b5a6d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.095608 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.196888 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.196930 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvbvj\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-kube-api-access-fvbvj\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.196976 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197028 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197055 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08a7721f-38a1-4a82-88ed-6f70290b5a6d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197074 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197101 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197174 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197201 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08a7721f-38a1-4a82-88ed-6f70290b5a6d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.197884 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.198113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.198269 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.198319 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.198354 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.199256 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.213183 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.216640 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.216998 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08a7721f-38a1-4a82-88ed-6f70290b5a6d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.217574 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08a7721f-38a1-4a82-88ed-6f70290b5a6d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.218112 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvbvj\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-kube-api-access-fvbvj\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.218909 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.220458 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.221913 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.222459 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.222624 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.222785 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.222956 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.223136 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rct46"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.223283 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.235891 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.249294 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.266573 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.286007 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-46l7c" event={"ID":"44e528c2-5eb9-465e-8df9-012865d20ced","Type":"ContainerStarted","Data":"e5da390df93eb02730afa36baf815a888c6844ae5c5dec8bf3ec0f6d3e1a7e86"}
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.287115 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jwkgj" event={"ID":"27ef60d6-9b76-4b61-9c92-75fa394546a0","Type":"ContainerStarted","Data":"d29cabfdcec3c05e6ad3f1bd1c705f5a11e2fd028f36fd84f0de66d0a2b2cdd7"}
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299207 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299267 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299305 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299327 4834 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299363 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299388 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299422 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299441 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299461 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh2nl\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-kube-api-access-fh2nl\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.299607 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401131 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401502 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401525 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fh2nl\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-kube-api-access-fh2nl\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401545 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401563 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401601 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401631 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401658 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401676 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401708 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.401727 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.402652 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.403095 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.403443 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.405301 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.406052 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.410064 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.410271 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc 
kubenswrapper[4834]: I1008 22:40:40.414284 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.416615 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.417282 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh2nl\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-kube-api-access-fh2nl\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.417334 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.430104 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.644379 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:40:40 crc kubenswrapper[4834]: I1008 22:40:40.741842 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.875040 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.880796 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.882562 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.883053 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jwr8d" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.883462 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.884708 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.887358 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.890980 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.907697 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.953795 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: 
\"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.953838 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.953856 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-secrets\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.953899 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-default\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.953924 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.953967 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjvsh\" (UniqueName: \"kubernetes.io/projected/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kube-api-access-bjvsh\") pod \"openstack-galera-0\" (UID: 
\"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.954019 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.954058 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kolla-config\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:42 crc kubenswrapper[4834]: I1008 22:40:42.954124 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.002363 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.003946 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.007670 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.007900 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.007949 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.007964 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xbdnd" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.008162 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.062258 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.064209 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.064612 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.066253 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.066422 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.066776 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-secrets\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.066635 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.066743 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.067110 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-default\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.067330 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.067568 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894c1f04-42d4-43de-a34a-19200ceec426-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.067733 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvsh\" (UniqueName: \"kubernetes.io/projected/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kube-api-access-bjvsh\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.068408 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.068852 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.068966 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-default\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.068986 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.069175 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.069408 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.069446 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kolla-config\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.069530 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjnjf\" (UniqueName: \"kubernetes.io/projected/894c1f04-42d4-43de-a34a-19200ceec426-kube-api-access-kjnjf\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.069593 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.069625 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.070274 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kolla-config\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.076374 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.076380 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.080830 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-secrets\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.084335 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjvsh\" (UniqueName: \"kubernetes.io/projected/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kube-api-access-bjvsh\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.103705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171369 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894c1f04-42d4-43de-a34a-19200ceec426-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " 
pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171421 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171442 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171461 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171482 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjnjf\" (UniqueName: \"kubernetes.io/projected/894c1f04-42d4-43de-a34a-19200ceec426-kube-api-access-kjnjf\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 
22:40:43.171622 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171647 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171681 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.171876 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894c1f04-42d4-43de-a34a-19200ceec426-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.172537 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.172591 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.172877 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.173233 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.178901 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.179077 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.181838 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.189437 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjnjf\" (UniqueName: \"kubernetes.io/projected/894c1f04-42d4-43de-a34a-19200ceec426-kube-api-access-kjnjf\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.192106 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") " pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.210059 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.369247 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.432997 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.433939 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.440455 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.440583 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.440924 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zw8c4" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.447788 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.577396 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqk2\" (UniqueName: \"kubernetes.io/projected/168e9a74-197a-4210-a553-7162c2f521af-kube-api-access-pnqk2\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.577744 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-config-data\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.577855 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-combined-ca-bundle\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.578085 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-memcached-tls-certs\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.578284 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-kolla-config\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.680921 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqk2\" (UniqueName: \"kubernetes.io/projected/168e9a74-197a-4210-a553-7162c2f521af-kube-api-access-pnqk2\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.680978 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-config-data\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.680996 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-combined-ca-bundle\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.681136 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.681200 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-kolla-config\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.681924 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-kolla-config\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.682923 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-config-data\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.686883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-combined-ca-bundle\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.690696 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-memcached-tls-certs\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.712222 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqk2\" (UniqueName: 
\"kubernetes.io/projected/168e9a74-197a-4210-a553-7162c2f521af-kube-api-access-pnqk2\") pod \"memcached-0\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " pod="openstack/memcached-0" Oct 08 22:40:43 crc kubenswrapper[4834]: I1008 22:40:43.751164 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 22:40:44 crc kubenswrapper[4834]: W1008 22:40:44.893506 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a7721f_38a1_4a82_88ed_6f70290b5a6d.slice/crio-2ef2b6b302292227b34f6c27277b8625c533333fc59fa4b246d88e8be45f12ad WatchSource:0}: Error finding container 2ef2b6b302292227b34f6c27277b8625c533333fc59fa4b246d88e8be45f12ad: Status 404 returned error can't find the container with id 2ef2b6b302292227b34f6c27277b8625c533333fc59fa4b246d88e8be45f12ad Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.057895 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.059309 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.072000 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lk5gw" Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.088358 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.209943 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2kf6\" (UniqueName: \"kubernetes.io/projected/53daad82-5614-475c-b3c0-95329fc7d46a-kube-api-access-t2kf6\") pod \"kube-state-metrics-0\" (UID: \"53daad82-5614-475c-b3c0-95329fc7d46a\") " pod="openstack/kube-state-metrics-0" Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.311159 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2kf6\" (UniqueName: \"kubernetes.io/projected/53daad82-5614-475c-b3c0-95329fc7d46a-kube-api-access-t2kf6\") pod \"kube-state-metrics-0\" (UID: \"53daad82-5614-475c-b3c0-95329fc7d46a\") " pod="openstack/kube-state-metrics-0" Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.325361 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"08a7721f-38a1-4a82-88ed-6f70290b5a6d","Type":"ContainerStarted","Data":"2ef2b6b302292227b34f6c27277b8625c533333fc59fa4b246d88e8be45f12ad"} Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.336322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2kf6\" (UniqueName: \"kubernetes.io/projected/53daad82-5614-475c-b3c0-95329fc7d46a-kube-api-access-t2kf6\") pod \"kube-state-metrics-0\" (UID: \"53daad82-5614-475c-b3c0-95329fc7d46a\") " pod="openstack/kube-state-metrics-0" Oct 08 22:40:45 crc kubenswrapper[4834]: I1008 22:40:45.385105 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.851828 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7bm6j"] Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.853775 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.855610 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.859603 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.859714 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xcxkn" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.862939 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7bm6j"] Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.869610 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jmkzz"] Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.871835 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.876129 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jmkzz"] Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.944281 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974837 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-lib\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974888 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run-ovn\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974914 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f0068c-4e61-4079-9d62-b338472e817d-scripts\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974935 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-log\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974950 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-log-ovn\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974975 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e43c87-585d-4d7c-bd16-ab66b531e024-scripts\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.974992 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-combined-ca-bundle\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.975031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-ovn-controller-tls-certs\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.975059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.975074 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqg8b\" (UniqueName: \"kubernetes.io/projected/77e43c87-585d-4d7c-bd16-ab66b531e024-kube-api-access-vqg8b\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.975091 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-etc-ovs\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.975117 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-run\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:48 crc kubenswrapper[4834]: I1008 22:40:48.975138 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rtf\" (UniqueName: \"kubernetes.io/projected/74f0068c-4e61-4079-9d62-b338472e817d-kube-api-access-d6rtf\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076433 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-ovn-controller-tls-certs\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076517 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076548 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqg8b\" (UniqueName: \"kubernetes.io/projected/77e43c87-585d-4d7c-bd16-ab66b531e024-kube-api-access-vqg8b\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076576 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-etc-ovs\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-run\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076660 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rtf\" (UniqueName: \"kubernetes.io/projected/74f0068c-4e61-4079-9d62-b338472e817d-kube-api-access-d6rtf\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076714 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-lib\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076738 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run-ovn\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f0068c-4e61-4079-9d62-b338472e817d-scripts\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076796 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-log\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076824 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-log-ovn\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076858 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e43c87-585d-4d7c-bd16-ab66b531e024-scripts\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " 
pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.076885 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-combined-ca-bundle\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.077865 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-lib\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.078043 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.078671 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-etc-ovs\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.078818 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run-ovn\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.079074 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-log-ovn\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.078889 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-log\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.082853 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f0068c-4e61-4079-9d62-b338472e817d-scripts\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.083365 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-ovn-controller-tls-certs\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.088544 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e43c87-585d-4d7c-bd16-ab66b531e024-scripts\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.089665 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-run\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " 
pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.090652 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-combined-ca-bundle\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.098476 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqg8b\" (UniqueName: \"kubernetes.io/projected/77e43c87-585d-4d7c-bd16-ab66b531e024-kube-api-access-vqg8b\") pod \"ovn-controller-ovs-jmkzz\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.103284 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rtf\" (UniqueName: \"kubernetes.io/projected/74f0068c-4e61-4079-9d62-b338472e817d-kube-api-access-d6rtf\") pod \"ovn-controller-7bm6j\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.181052 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7bm6j" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.190647 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.748661 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.750473 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.754907 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.755086 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.755177 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.755173 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-26qbk" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.755226 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.770481 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.891555 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.891612 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.891685 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.891728 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.891757 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.891928 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnt5k\" (UniqueName: \"kubernetes.io/projected/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-kube-api-access-dnt5k\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.892039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-config\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.892226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.995809 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.995883 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.995979 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996033 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996062 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" 
Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996085 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnt5k\" (UniqueName: \"kubernetes.io/projected/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-kube-api-access-dnt5k\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-config\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996242 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996413 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.996891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:49 crc kubenswrapper[4834]: I1008 22:40:49.998793 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-config\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:50 crc kubenswrapper[4834]: I1008 22:40:50.004706 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:50 crc kubenswrapper[4834]: I1008 22:40:50.011455 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:50 crc kubenswrapper[4834]: I1008 22:40:50.011603 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:50 crc kubenswrapper[4834]: I1008 22:40:50.015806 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnt5k\" (UniqueName: \"kubernetes.io/projected/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-kube-api-access-dnt5k\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " 
pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:50 crc kubenswrapper[4834]: I1008 22:40:50.033850 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:50 crc kubenswrapper[4834]: I1008 22:40:50.089272 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.183034 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.191023 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.193087 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.193339 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.193452 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-xrxvp" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.193825 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.194686 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240442 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-config\") pod \"ovsdbserver-sb-0\" 
(UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240489 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240513 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240551 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240599 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240630 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " 
pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.240675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvwj\" (UniqueName: \"kubernetes.io/projected/701b75e6-1acc-47d0-85de-2349a6345a3b-kube-api-access-nfvwj\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342483 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342570 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342610 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nfvwj\" (UniqueName: \"kubernetes.io/projected/701b75e6-1acc-47d0-85de-2349a6345a3b-kube-api-access-nfvwj\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342692 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-config\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342716 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342747 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.342787 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.343020 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"701b75e6-1acc-47d0-85de-2349a6345a3b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.343763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-config\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.344136 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.344977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.346339 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.346761 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.359300 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nfvwj\" (UniqueName: \"kubernetes.io/projected/701b75e6-1acc-47d0-85de-2349a6345a3b-kube-api-access-nfvwj\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.361124 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.367992 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:52 crc kubenswrapper[4834]: I1008 22:40:52.520749 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 22:40:53 crc kubenswrapper[4834]: W1008 22:40:53.467917 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod168e9a74_197a_4210_a553_7162c2f521af.slice/crio-9e459cb24d84ef24a6424c3430e6389583e448564dbe71e65be8b7d58c80e8e9 WatchSource:0}: Error finding container 9e459cb24d84ef24a6424c3430e6389583e448564dbe71e65be8b7d58c80e8e9: Status 404 returned error can't find the container with id 9e459cb24d84ef24a6424c3430e6389583e448564dbe71e65be8b7d58c80e8e9 Oct 08 22:40:54 crc kubenswrapper[4834]: I1008 22:40:54.395281 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"168e9a74-197a-4210-a553-7162c2f521af","Type":"ContainerStarted","Data":"9e459cb24d84ef24a6424c3430e6389583e448564dbe71e65be8b7d58c80e8e9"} Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.506298 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.506794 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dxhdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bfcb9d745-tx5lv_openstack(11e53c46-ad91-4dd7-8288-d4f9c362a180): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.508423 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv" podUID="11e53c46-ad91-4dd7-8288-d4f9c362a180" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.622615 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.622779 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n74d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil
,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-644597f84c-46l7c_openstack(44e528c2-5eb9-465e-8df9-012865d20ced): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.629816 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-644597f84c-46l7c" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.649224 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.649407 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bn5jk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-758b79db4c-l479j_openstack(73eef7aa-916e-4c41-b86a-27cf30ef8633): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.650545 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-758b79db4c-l479j" podUID="73eef7aa-916e-4c41-b86a-27cf30ef8633" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.739965 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.740128 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9zvv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77597f887-jwkgj_openstack(27ef60d6-9b76-4b61-9c92-75fa394546a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:40:57 crc kubenswrapper[4834]: E1008 22:40:57.741357 4834 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77597f887-jwkgj" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.002030 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.136067 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.149805 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.226977 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7bm6j"] Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.231617 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.380359 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:40:58 crc kubenswrapper[4834]: E1008 22:40:58.431100 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" pod="openstack/dnsmasq-dns-644597f84c-46l7c" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" Oct 08 22:40:58 crc kubenswrapper[4834]: E1008 22:40:58.431798 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" 
pod="openstack/dnsmasq-dns-77597f887-jwkgj" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" Oct 08 22:40:58 crc kubenswrapper[4834]: I1008 22:40:58.508356 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.027656 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jmkzz"] Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.444863 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"08a7721f-38a1-4a82-88ed-6f70290b5a6d","Type":"ContainerStarted","Data":"3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721"} Oct 08 22:40:59 crc kubenswrapper[4834]: W1008 22:40:59.575862 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53daad82_5614_475c_b3c0_95329fc7d46a.slice/crio-2362b88f1267a90efa20c80f39e46604f59242052ff670c98e29520f066ec3e1 WatchSource:0}: Error finding container 2362b88f1267a90efa20c80f39e46604f59242052ff670c98e29520f066ec3e1: Status 404 returned error can't find the container with id 2362b88f1267a90efa20c80f39e46604f59242052ff670c98e29520f066ec3e1 Oct 08 22:40:59 crc kubenswrapper[4834]: W1008 22:40:59.592883 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46d2d9d8_6fb8_45a7_bcba_5d6121b26dda.slice/crio-12e93a451c4be4fa61f58546f4fce10f4939859ee44ad0080902babf1f66944b WatchSource:0}: Error finding container 12e93a451c4be4fa61f58546f4fce10f4939859ee44ad0080902babf1f66944b: Status 404 returned error can't find the container with id 12e93a451c4be4fa61f58546f4fce10f4939859ee44ad0080902babf1f66944b Oct 08 22:40:59 crc kubenswrapper[4834]: W1008 22:40:59.593259 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34aacb58_3b8d_466d_9b71_e7098b95fe8e.slice/crio-ee16b9bf45efb2157cb54df419fb5da30fb8110333fdcccea76fdbba81208ad3 WatchSource:0}: Error finding container ee16b9bf45efb2157cb54df419fb5da30fb8110333fdcccea76fdbba81208ad3: Status 404 returned error can't find the container with id ee16b9bf45efb2157cb54df419fb5da30fb8110333fdcccea76fdbba81208ad3 Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.727045 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.733208 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-l479j" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.779740 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e53c46-ad91-4dd7-8288-d4f9c362a180-config\") pod \"11e53c46-ad91-4dd7-8288-d4f9c362a180\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.779833 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-config\") pod \"73eef7aa-916e-4c41-b86a-27cf30ef8633\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.779854 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-dns-svc\") pod \"73eef7aa-916e-4c41-b86a-27cf30ef8633\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.779978 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn5jk\" (UniqueName: 
\"kubernetes.io/projected/73eef7aa-916e-4c41-b86a-27cf30ef8633-kube-api-access-bn5jk\") pod \"73eef7aa-916e-4c41-b86a-27cf30ef8633\" (UID: \"73eef7aa-916e-4c41-b86a-27cf30ef8633\") " Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.780019 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxhdv\" (UniqueName: \"kubernetes.io/projected/11e53c46-ad91-4dd7-8288-d4f9c362a180-kube-api-access-dxhdv\") pod \"11e53c46-ad91-4dd7-8288-d4f9c362a180\" (UID: \"11e53c46-ad91-4dd7-8288-d4f9c362a180\") " Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.780770 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-config" (OuterVolumeSpecName: "config") pod "73eef7aa-916e-4c41-b86a-27cf30ef8633" (UID: "73eef7aa-916e-4c41-b86a-27cf30ef8633"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.780820 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e53c46-ad91-4dd7-8288-d4f9c362a180-config" (OuterVolumeSpecName: "config") pod "11e53c46-ad91-4dd7-8288-d4f9c362a180" (UID: "11e53c46-ad91-4dd7-8288-d4f9c362a180"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.780817 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73eef7aa-916e-4c41-b86a-27cf30ef8633" (UID: "73eef7aa-916e-4c41-b86a-27cf30ef8633"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.784492 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e53c46-ad91-4dd7-8288-d4f9c362a180-kube-api-access-dxhdv" (OuterVolumeSpecName: "kube-api-access-dxhdv") pod "11e53c46-ad91-4dd7-8288-d4f9c362a180" (UID: "11e53c46-ad91-4dd7-8288-d4f9c362a180"). InnerVolumeSpecName "kube-api-access-dxhdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.784608 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73eef7aa-916e-4c41-b86a-27cf30ef8633-kube-api-access-bn5jk" (OuterVolumeSpecName: "kube-api-access-bn5jk") pod "73eef7aa-916e-4c41-b86a-27cf30ef8633" (UID: "73eef7aa-916e-4c41-b86a-27cf30ef8633"). InnerVolumeSpecName "kube-api-access-bn5jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.882609 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn5jk\" (UniqueName: \"kubernetes.io/projected/73eef7aa-916e-4c41-b86a-27cf30ef8633-kube-api-access-bn5jk\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.882661 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxhdv\" (UniqueName: \"kubernetes.io/projected/11e53c46-ad91-4dd7-8288-d4f9c362a180-kube-api-access-dxhdv\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.882681 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e53c46-ad91-4dd7-8288-d4f9c362a180-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.882702 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:40:59 crc kubenswrapper[4834]: I1008 22:40:59.882720 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73eef7aa-916e-4c41-b86a-27cf30ef8633-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.467395 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerStarted","Data":"76706c2eee92568380cfe1f5f9dc6e6ad86ac08ebce8672626360217a3030472"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.469272 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"53daad82-5614-475c-b3c0-95329fc7d46a","Type":"ContainerStarted","Data":"2362b88f1267a90efa20c80f39e46604f59242052ff670c98e29520f066ec3e1"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.471406 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7bm6j" event={"ID":"74f0068c-4e61-4079-9d62-b338472e817d","Type":"ContainerStarted","Data":"d2fb620a938936a2a1dfb9513e868a151c8061bda5d26bfe807d6cb6613e8027"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.473229 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-l479j" event={"ID":"73eef7aa-916e-4c41-b86a-27cf30ef8633","Type":"ContainerDied","Data":"6505c828d1053258e390ae6fa196f1eddc338f6eac4c31808f7f7a163b0162e4"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.473236 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-l479j" Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.475593 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda","Type":"ContainerStarted","Data":"12e93a451c4be4fa61f58546f4fce10f4939859ee44ad0080902babf1f66944b"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.477273 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv" event={"ID":"11e53c46-ad91-4dd7-8288-d4f9c362a180","Type":"ContainerDied","Data":"0b149bf1e54e62d3eb8888cb5f57af3b547f47c2aabd1775dad88d83b364f7e7"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.477360 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-tx5lv" Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.482264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"701b75e6-1acc-47d0-85de-2349a6345a3b","Type":"ContainerStarted","Data":"a34882ae29fc24ac9c03f5a7b262b297a0dfe1f7995f4b32690e15875b764330"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.486915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894c1f04-42d4-43de-a34a-19200ceec426","Type":"ContainerStarted","Data":"3a5f11e1a18b3ab6fb6d344716492986e86282b3ecff839b1e9a50641468d1e3"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.489854 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9809d14f-10d2-479f-94d9-5b3ae7f49e7b","Type":"ContainerStarted","Data":"f3a013dc36f7527e1a3f86374b0da4447d80bc2da2ad93f227efdfbfd8b071d3"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.493079 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"34aacb58-3b8d-466d-9b71-e7098b95fe8e","Type":"ContainerStarted","Data":"ee16b9bf45efb2157cb54df419fb5da30fb8110333fdcccea76fdbba81208ad3"} Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.548003 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-l479j"] Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.562401 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-l479j"] Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.597983 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-tx5lv"] Oct 08 22:41:00 crc kubenswrapper[4834]: I1008 22:41:00.607991 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-tx5lv"] Oct 08 22:41:01 crc kubenswrapper[4834]: I1008 22:41:01.504404 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9809d14f-10d2-479f-94d9-5b3ae7f49e7b","Type":"ContainerStarted","Data":"f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee"} Oct 08 22:41:01 crc kubenswrapper[4834]: I1008 22:41:01.574989 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e53c46-ad91-4dd7-8288-d4f9c362a180" path="/var/lib/kubelet/pods/11e53c46-ad91-4dd7-8288-d4f9c362a180/volumes" Oct 08 22:41:01 crc kubenswrapper[4834]: I1008 22:41:01.575695 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73eef7aa-916e-4c41-b86a-27cf30ef8633" path="/var/lib/kubelet/pods/73eef7aa-916e-4c41-b86a-27cf30ef8633/volumes" Oct 08 22:41:02 crc kubenswrapper[4834]: I1008 22:41:02.517233 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"168e9a74-197a-4210-a553-7162c2f521af","Type":"ContainerStarted","Data":"cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1"} Oct 08 22:41:02 crc kubenswrapper[4834]: I1008 22:41:02.546232 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.618611606 podStartE2EDuration="19.546206732s" podCreationTimestamp="2025-10-08 22:40:43 +0000 UTC" firstStartedPulling="2025-10-08 22:40:53.473742196 +0000 UTC m=+1061.296626952" lastFinishedPulling="2025-10-08 22:41:01.401337322 +0000 UTC m=+1069.224222078" observedRunningTime="2025-10-08 22:41:02.54324877 +0000 UTC m=+1070.366133516" watchObservedRunningTime="2025-10-08 22:41:02.546206732 +0000 UTC m=+1070.369091488" Oct 08 22:41:03 crc kubenswrapper[4834]: I1008 22:41:03.526118 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 08 22:41:08 crc kubenswrapper[4834]: I1008 22:41:08.753630 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.618158 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"701b75e6-1acc-47d0-85de-2349a6345a3b","Type":"ContainerStarted","Data":"f5e73cfdd6ec9a2f76bfa0052b93e5d772c1194c5cd98642eb2f1e306db7ed10"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.619647 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"53daad82-5614-475c-b3c0-95329fc7d46a","Type":"ContainerStarted","Data":"ca24648ca7cb1d935733fd8d6a5dae57fa7a4565f69f1abcdbe7a629bc83a777"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.620098 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.620877 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7bm6j" event={"ID":"74f0068c-4e61-4079-9d62-b338472e817d","Type":"ContainerStarted","Data":"0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 
22:41:13.621028 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7bm6j" Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.622904 4834 generic.go:334] "Generic (PLEG): container finished" podID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerID="529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c" exitCode=0 Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.622957 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jwkgj" event={"ID":"27ef60d6-9b76-4b61-9c92-75fa394546a0","Type":"ContainerDied","Data":"529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.627524 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894c1f04-42d4-43de-a34a-19200ceec426","Type":"ContainerStarted","Data":"76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.630761 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda","Type":"ContainerStarted","Data":"77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.637701 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.290991601 podStartE2EDuration="28.637684426s" podCreationTimestamp="2025-10-08 22:40:45 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.597658787 +0000 UTC m=+1067.420543563" lastFinishedPulling="2025-10-08 22:41:12.944351632 +0000 UTC m=+1080.767236388" observedRunningTime="2025-10-08 22:41:13.633575616 +0000 UTC m=+1081.456460382" watchObservedRunningTime="2025-10-08 22:41:13.637684426 +0000 UTC m=+1081.460569172" Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.639839 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34aacb58-3b8d-466d-9b71-e7098b95fe8e","Type":"ContainerStarted","Data":"e01a7e96d33013eec69b7d341daec852f82c75bfd87353e3cb99da288feb48fd"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.642793 4834 generic.go:334] "Generic (PLEG): container finished" podID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerID="c2e3f3cf71362e248e776b5437f379560fe494d7b88c76a620e0eb8ccd7a5466" exitCode=0 Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.642837 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerDied","Data":"c2e3f3cf71362e248e776b5437f379560fe494d7b88c76a620e0eb8ccd7a5466"} Oct 08 22:41:13 crc kubenswrapper[4834]: I1008 22:41:13.674962 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7bm6j" podStartSLOduration=14.148766944 podStartE2EDuration="25.674942774s" podCreationTimestamp="2025-10-08 22:40:48 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.575810276 +0000 UTC m=+1067.398695022" lastFinishedPulling="2025-10-08 22:41:11.101986106 +0000 UTC m=+1078.924870852" observedRunningTime="2025-10-08 22:41:13.673369855 +0000 UTC m=+1081.496254601" watchObservedRunningTime="2025-10-08 22:41:13.674942774 +0000 UTC m=+1081.497827520" Oct 08 22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.651655 4834 generic.go:334] "Generic (PLEG): container finished" podID="44e528c2-5eb9-465e-8df9-012865d20ced" containerID="64fd98bb69dac7ab8ba350b5c61193e765d42f583c0863b0f4e186a2f503569a" exitCode=0 Oct 08 22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.652460 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-46l7c" event={"ID":"44e528c2-5eb9-465e-8df9-012865d20ced","Type":"ContainerDied","Data":"64fd98bb69dac7ab8ba350b5c61193e765d42f583c0863b0f4e186a2f503569a"} Oct 08 
22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.658505 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jwkgj" event={"ID":"27ef60d6-9b76-4b61-9c92-75fa394546a0","Type":"ContainerStarted","Data":"bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e"} Oct 08 22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.658701 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77597f887-jwkgj" Oct 08 22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.663311 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerStarted","Data":"dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799"} Oct 08 22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.663360 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerStarted","Data":"a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811"} Oct 08 22:41:14 crc kubenswrapper[4834]: I1008 22:41:14.707110 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jmkzz" podStartSLOduration=15.364474638 podStartE2EDuration="26.707085019s" podCreationTimestamp="2025-10-08 22:40:48 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.601939831 +0000 UTC m=+1067.424824607" lastFinishedPulling="2025-10-08 22:41:10.944550202 +0000 UTC m=+1078.767434988" observedRunningTime="2025-10-08 22:41:14.696993983 +0000 UTC m=+1082.519878739" watchObservedRunningTime="2025-10-08 22:41:14.707085019 +0000 UTC m=+1082.529969765" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.453657 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77597f887-jwkgj" podStartSLOduration=3.438537051 podStartE2EDuration="36.453641639s" 
podCreationTimestamp="2025-10-08 22:40:39 +0000 UTC" firstStartedPulling="2025-10-08 22:40:39.926501167 +0000 UTC m=+1047.749385903" lastFinishedPulling="2025-10-08 22:41:12.941605735 +0000 UTC m=+1080.764490491" observedRunningTime="2025-10-08 22:41:14.715329819 +0000 UTC m=+1082.538214575" watchObservedRunningTime="2025-10-08 22:41:15.453641639 +0000 UTC m=+1083.276526385" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.456296 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-46l7c"] Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.500679 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57dc6dbc4c-jm42x"] Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.501855 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.523768 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57dc6dbc4c-jm42x"] Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.575973 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-config\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.576048 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlnjf\" (UniqueName: \"kubernetes.io/projected/78090383-28da-4042-affd-d5324d62bef4-kube-api-access-nlnjf\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.576097 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-dns-svc\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.674205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-46l7c" event={"ID":"44e528c2-5eb9-465e-8df9-012865d20ced","Type":"ContainerStarted","Data":"a188889565e6b65f10154bc8da82638cedd8d784828f5cb0e1798acec24eeead"} Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.674433 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.674537 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.674544 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644597f84c-46l7c" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" containerName="dnsmasq-dns" containerID="cri-o://a188889565e6b65f10154bc8da82638cedd8d784828f5cb0e1798acec24eeead" gracePeriod=10 Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.679300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-config\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.679368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlnjf\" (UniqueName: \"kubernetes.io/projected/78090383-28da-4042-affd-d5324d62bef4-kube-api-access-nlnjf\") pod 
\"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.679428 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-dns-svc\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.680246 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-dns-svc\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.680328 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-config\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.699129 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644597f84c-46l7c" podStartSLOduration=-9223371999.15567 podStartE2EDuration="37.699105636s" podCreationTimestamp="2025-10-08 22:40:38 +0000 UTC" firstStartedPulling="2025-10-08 22:40:39.651698755 +0000 UTC m=+1047.474583511" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:15.693351156 +0000 UTC m=+1083.516235902" watchObservedRunningTime="2025-10-08 22:41:15.699105636 +0000 UTC m=+1083.521990382" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.701486 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nlnjf\" (UniqueName: \"kubernetes.io/projected/78090383-28da-4042-affd-d5324d62bef4-kube-api-access-nlnjf\") pod \"dnsmasq-dns-57dc6dbc4c-jm42x\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:15 crc kubenswrapper[4834]: I1008 22:41:15.836947 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.667510 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.673096 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.674986 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.676032 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4lzq9" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.676213 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.676339 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.691476 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.702705 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-cache\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 
22:41:16.702753 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhrb\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-kube-api-access-dbhrb\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.702843 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.702864 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-lock\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.702880 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.719746 4834 generic.go:334] "Generic (PLEG): container finished" podID="44e528c2-5eb9-465e-8df9-012865d20ced" containerID="a188889565e6b65f10154bc8da82638cedd8d784828f5cb0e1798acec24eeead" exitCode=0 Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.720112 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-46l7c" 
event={"ID":"44e528c2-5eb9-465e-8df9-012865d20ced","Type":"ContainerDied","Data":"a188889565e6b65f10154bc8da82638cedd8d784828f5cb0e1798acec24eeead"} Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.720187 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-46l7c" event={"ID":"44e528c2-5eb9-465e-8df9-012865d20ced","Type":"ContainerDied","Data":"e5da390df93eb02730afa36baf815a888c6844ae5c5dec8bf3ec0f6d3e1a7e86"} Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.720203 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5da390df93eb02730afa36baf815a888c6844ae5c5dec8bf3ec0f6d3e1a7e86" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.723241 4834 generic.go:334] "Generic (PLEG): container finished" podID="894c1f04-42d4-43de-a34a-19200ceec426" containerID="76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa" exitCode=0 Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.723308 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894c1f04-42d4-43de-a34a-19200ceec426","Type":"ContainerDied","Data":"76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa"} Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.772439 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-46l7c" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.805366 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-cache\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.805438 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhrb\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-kube-api-access-dbhrb\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.805557 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.805578 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-lock\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.805600 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.805996 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.808497 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-cache\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: E1008 22:41:16.808645 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:41:16 crc kubenswrapper[4834]: E1008 22:41:16.808670 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:41:16 crc kubenswrapper[4834]: E1008 22:41:16.808722 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift podName:6ab95611-95ff-46bf-9b06-2ed44a58fa46 nodeName:}" failed. No retries permitted until 2025-10-08 22:41:17.308698958 +0000 UTC m=+1085.131583694 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift") pod "swift-storage-0" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46") : configmap "swift-ring-files" not found Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.809081 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-lock\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.827035 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhrb\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-kube-api-access-dbhrb\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.839329 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.907196 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-config\") pod \"44e528c2-5eb9-465e-8df9-012865d20ced\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.907651 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n74d6\" (UniqueName: \"kubernetes.io/projected/44e528c2-5eb9-465e-8df9-012865d20ced-kube-api-access-n74d6\") pod \"44e528c2-5eb9-465e-8df9-012865d20ced\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " Oct 08 
22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.907873 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-dns-svc\") pod \"44e528c2-5eb9-465e-8df9-012865d20ced\" (UID: \"44e528c2-5eb9-465e-8df9-012865d20ced\") " Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.913397 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e528c2-5eb9-465e-8df9-012865d20ced-kube-api-access-n74d6" (OuterVolumeSpecName: "kube-api-access-n74d6") pod "44e528c2-5eb9-465e-8df9-012865d20ced" (UID: "44e528c2-5eb9-465e-8df9-012865d20ced"). InnerVolumeSpecName "kube-api-access-n74d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.949232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-config" (OuterVolumeSpecName: "config") pod "44e528c2-5eb9-465e-8df9-012865d20ced" (UID: "44e528c2-5eb9-465e-8df9-012865d20ced"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:16 crc kubenswrapper[4834]: I1008 22:41:16.951876 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "44e528c2-5eb9-465e-8df9-012865d20ced" (UID: "44e528c2-5eb9-465e-8df9-012865d20ced"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.010694 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.010734 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n74d6\" (UniqueName: \"kubernetes.io/projected/44e528c2-5eb9-465e-8df9-012865d20ced-kube-api-access-n74d6\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.010751 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/44e528c2-5eb9-465e-8df9-012865d20ced-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.230333 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2cn89"] Oct 08 22:41:17 crc kubenswrapper[4834]: E1008 22:41:17.231091 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" containerName="init" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.231109 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" containerName="init" Oct 08 22:41:17 crc kubenswrapper[4834]: E1008 22:41:17.231167 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" containerName="dnsmasq-dns" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.231177 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" containerName="dnsmasq-dns" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.231352 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" containerName="dnsmasq-dns" Oct 08 22:41:17 crc 
kubenswrapper[4834]: I1008 22:41:17.236365 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.239759 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.240003 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.240166 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.278796 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2cn89"] Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319005 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-ring-data-devices\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-scripts\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319159 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-combined-ca-bundle\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " 
pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319209 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-etc-swift\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319237 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-dispersionconf\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6ncv\" (UniqueName: \"kubernetes.io/projected/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-kube-api-access-l6ncv\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319338 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.319396 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-swiftconf\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" 
Oct 08 22:41:17 crc kubenswrapper[4834]: E1008 22:41:17.319667 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:41:17 crc kubenswrapper[4834]: E1008 22:41:17.319760 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:41:17 crc kubenswrapper[4834]: E1008 22:41:17.319824 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift podName:6ab95611-95ff-46bf-9b06-2ed44a58fa46 nodeName:}" failed. No retries permitted until 2025-10-08 22:41:18.319802144 +0000 UTC m=+1086.142686890 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift") pod "swift-storage-0" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46") : configmap "swift-ring-files" not found Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.420855 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-etc-swift\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.420907 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-dispersionconf\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.420936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6ncv\" (UniqueName: 
\"kubernetes.io/projected/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-kube-api-access-l6ncv\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.420984 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-swiftconf\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.421012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-ring-data-devices\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.421056 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-scripts\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.421080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-combined-ca-bundle\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.422040 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-etc-swift\") pod 
\"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.422966 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57dc6dbc4c-jm42x"] Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.423201 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-ring-data-devices\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.423552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-scripts\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.426928 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-combined-ca-bundle\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.426942 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-swiftconf\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.427809 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-dispersionconf\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: W1008 22:41:17.428258 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78090383_28da_4042_affd_d5324d62bef4.slice/crio-89dcd9b6a76955084d5d80d8d2c1d6794b84b5bd44ce6ebf237d65b95694c5fc WatchSource:0}: Error finding container 89dcd9b6a76955084d5d80d8d2c1d6794b84b5bd44ce6ebf237d65b95694c5fc: Status 404 returned error can't find the container with id 89dcd9b6a76955084d5d80d8d2c1d6794b84b5bd44ce6ebf237d65b95694c5fc Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.438178 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6ncv\" (UniqueName: \"kubernetes.io/projected/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-kube-api-access-l6ncv\") pod \"swift-ring-rebalance-2cn89\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.601692 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.731432 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894c1f04-42d4-43de-a34a-19200ceec426","Type":"ContainerStarted","Data":"d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7"} Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.740586 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda","Type":"ContainerStarted","Data":"28c0d0fa60a03d4e0502734c048a91f1f1485f354314d1d0e3b140ec8efc322e"} Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.751059 4834 generic.go:334] "Generic (PLEG): container finished" podID="78090383-28da-4042-affd-d5324d62bef4" containerID="fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4" exitCode=0 Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.751182 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" event={"ID":"78090383-28da-4042-affd-d5324d62bef4","Type":"ContainerDied","Data":"fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4"} Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.751210 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" event={"ID":"78090383-28da-4042-affd-d5324d62bef4","Type":"ContainerStarted","Data":"89dcd9b6a76955084d5d80d8d2c1d6794b84b5bd44ce6ebf237d65b95694c5fc"} Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.753966 4834 generic.go:334] "Generic (PLEG): container finished" podID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerID="e01a7e96d33013eec69b7d341daec852f82c75bfd87353e3cb99da288feb48fd" exitCode=0 Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.754039 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"34aacb58-3b8d-466d-9b71-e7098b95fe8e","Type":"ContainerDied","Data":"e01a7e96d33013eec69b7d341daec852f82c75bfd87353e3cb99da288feb48fd"} Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.765415 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-46l7c" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.766922 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"701b75e6-1acc-47d0-85de-2349a6345a3b","Type":"ContainerStarted","Data":"e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76"} Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.772163 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.418416949 podStartE2EDuration="36.772112529s" podCreationTimestamp="2025-10-08 22:40:41 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.590921544 +0000 UTC m=+1067.413806290" lastFinishedPulling="2025-10-08 22:41:10.944617064 +0000 UTC m=+1078.767501870" observedRunningTime="2025-10-08 22:41:17.760882946 +0000 UTC m=+1085.583767692" watchObservedRunningTime="2025-10-08 22:41:17.772112529 +0000 UTC m=+1085.594997275" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.789396 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.232976699 podStartE2EDuration="29.789321658s" podCreationTimestamp="2025-10-08 22:40:48 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.597404031 +0000 UTC m=+1067.420288797" lastFinishedPulling="2025-10-08 22:41:17.15374901 +0000 UTC m=+1084.976633756" observedRunningTime="2025-10-08 22:41:17.78566779 +0000 UTC m=+1085.608552536" watchObservedRunningTime="2025-10-08 22:41:17.789321658 +0000 UTC m=+1085.612206404" Oct 08 22:41:17 crc kubenswrapper[4834]: W1008 22:41:17.863724 4834 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbedf43f2_17b7_462d_8ca1_41d4dae1e6cb.slice/crio-ded69cf5b3f39635560c0af743553280f753d8d4ba461c3759bada094e6bcd55 WatchSource:0}: Error finding container ded69cf5b3f39635560c0af743553280f753d8d4ba461c3759bada094e6bcd55: Status 404 returned error can't find the container with id ded69cf5b3f39635560c0af743553280f753d8d4ba461c3759bada094e6bcd55 Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.865175 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2cn89"] Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.885838 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.354460848 podStartE2EDuration="26.885808458s" podCreationTimestamp="2025-10-08 22:40:51 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.592880811 +0000 UTC m=+1067.415765567" lastFinishedPulling="2025-10-08 22:41:17.124228431 +0000 UTC m=+1084.947113177" observedRunningTime="2025-10-08 22:41:17.840806962 +0000 UTC m=+1085.663691728" watchObservedRunningTime="2025-10-08 22:41:17.885808458 +0000 UTC m=+1085.708693224" Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.898157 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-46l7c"] Oct 08 22:41:17 crc kubenswrapper[4834]: I1008 22:41:17.912357 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-46l7c"] Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.336605 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:18 crc kubenswrapper[4834]: E1008 22:41:18.336804 4834 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:41:18 crc kubenswrapper[4834]: E1008 22:41:18.336831 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:41:18 crc kubenswrapper[4834]: E1008 22:41:18.336893 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift podName:6ab95611-95ff-46bf-9b06-2ed44a58fa46 nodeName:}" failed. No retries permitted until 2025-10-08 22:41:20.336875282 +0000 UTC m=+1088.159760028 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift") pod "swift-storage-0" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46") : configmap "swift-ring-files" not found Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.776709 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2cn89" event={"ID":"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb","Type":"ContainerStarted","Data":"ded69cf5b3f39635560c0af743553280f753d8d4ba461c3759bada094e6bcd55"} Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.779890 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" event={"ID":"78090383-28da-4042-affd-d5324d62bef4","Type":"ContainerStarted","Data":"06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70"} Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.780237 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.783759 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"34aacb58-3b8d-466d-9b71-e7098b95fe8e","Type":"ContainerStarted","Data":"3c24af656d20cb96d210268f8f068f5cf9e967d712c1e772384d7063e6db6c03"} Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.799772 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" podStartSLOduration=3.799750215 podStartE2EDuration="3.799750215s" podCreationTimestamp="2025-10-08 22:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:18.793887072 +0000 UTC m=+1086.616771828" watchObservedRunningTime="2025-10-08 22:41:18.799750215 +0000 UTC m=+1086.622634961" Oct 08 22:41:18 crc kubenswrapper[4834]: I1008 22:41:18.818708 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.176989832 podStartE2EDuration="37.818690945s" podCreationTimestamp="2025-10-08 22:40:41 +0000 UTC" firstStartedPulling="2025-10-08 22:40:59.604523575 +0000 UTC m=+1067.427408321" lastFinishedPulling="2025-10-08 22:41:11.246224658 +0000 UTC m=+1079.069109434" observedRunningTime="2025-10-08 22:41:18.813622903 +0000 UTC m=+1086.636507649" watchObservedRunningTime="2025-10-08 22:41:18.818690945 +0000 UTC m=+1086.641575691" Oct 08 22:41:19 crc kubenswrapper[4834]: I1008 22:41:19.415370 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77597f887-jwkgj" Oct 08 22:41:19 crc kubenswrapper[4834]: I1008 22:41:19.521448 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 08 22:41:19 crc kubenswrapper[4834]: I1008 22:41:19.569249 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e528c2-5eb9-465e-8df9-012865d20ced" path="/var/lib/kubelet/pods/44e528c2-5eb9-465e-8df9-012865d20ced/volumes" Oct 08 22:41:19 crc kubenswrapper[4834]: I1008 
22:41:19.576804 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 08 22:41:19 crc kubenswrapper[4834]: I1008 22:41:19.796606 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 08 22:41:19 crc kubenswrapper[4834]: I1008 22:41:19.861528 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.090683 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.091198 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.131111 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dc6dbc4c-jm42x"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.153624 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fcb8575d9-sjqmx"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.156240 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.159909 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.170825 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fcb8575d9-sjqmx"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.189979 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-698pm\" (UniqueName: \"kubernetes.io/projected/9e0d0702-07d0-4291-9683-6edadee6d8d8-kube-api-access-698pm\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.190030 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-config\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.190189 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-ovsdbserver-sb\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.190302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-dns-svc\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " 
pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.208238 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.236843 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l24rp"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.238506 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.243870 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.258228 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l24rp"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298069 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298351 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-dns-svc\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-combined-ca-bundle\") pod 
\"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298515 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-config\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298616 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-698pm\" (UniqueName: \"kubernetes.io/projected/9e0d0702-07d0-4291-9683-6edadee6d8d8-kube-api-access-698pm\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298644 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-config\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298700 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovn-rundir\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298776 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovs-rundir\") pod 
\"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298875 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vntm4\" (UniqueName: \"kubernetes.io/projected/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-kube-api-access-vntm4\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.298966 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-ovsdbserver-sb\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.299357 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-dns-svc\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.300025 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-ovsdbserver-sb\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.300344 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-config\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: 
\"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.327805 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-698pm\" (UniqueName: \"kubernetes.io/projected/9e0d0702-07d0-4291-9683-6edadee6d8d8-kube-api-access-698pm\") pod \"dnsmasq-dns-5fcb8575d9-sjqmx\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.402376 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-combined-ca-bundle\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.402973 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-config\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.403015 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovn-rundir\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.403049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovs-rundir\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " 
pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.403076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.403101 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vntm4\" (UniqueName: \"kubernetes.io/projected/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-kube-api-access-vntm4\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.403176 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: E1008 22:41:20.404062 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:41:20 crc kubenswrapper[4834]: E1008 22:41:20.404094 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:41:20 crc kubenswrapper[4834]: E1008 22:41:20.404194 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift podName:6ab95611-95ff-46bf-9b06-2ed44a58fa46 nodeName:}" failed. No retries permitted until 2025-10-08 22:41:24.404174426 +0000 UTC m=+1092.227059172 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift") pod "swift-storage-0" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46") : configmap "swift-ring-files" not found Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.404198 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovs-rundir\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.404352 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovn-rundir\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.406311 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-config\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.406424 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.412745 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-combined-ca-bundle\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.419643 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fcb8575d9-sjqmx"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.420885 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vntm4\" (UniqueName: \"kubernetes.io/projected/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-kube-api-access-vntm4\") pod \"ovn-controller-metrics-l24rp\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.422894 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.452075 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-5bxfv"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.453792 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.459116 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.467819 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-5bxfv"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.504357 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-nb\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.504419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgkl\" (UniqueName: \"kubernetes.io/projected/361e1c9f-8765-484b-a2ef-4b2e3db1af99-kube-api-access-txgkl\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.504442 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-config\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.504481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-dns-svc\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " 
pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.504528 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-sb\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.564064 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.606857 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgkl\" (UniqueName: \"kubernetes.io/projected/361e1c9f-8765-484b-a2ef-4b2e3db1af99-kube-api-access-txgkl\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.606928 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-config\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.606991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-dns-svc\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.607080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-sb\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.607184 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-nb\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.608190 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-config\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.608236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-sb\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.608705 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-nb\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.608769 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-dns-svc\") pod 
\"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.626187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgkl\" (UniqueName: \"kubernetes.io/projected/361e1c9f-8765-484b-a2ef-4b2e3db1af99-kube-api-access-txgkl\") pod \"dnsmasq-dns-57f58c7cff-5bxfv\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.793300 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.803523 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" podUID="78090383-28da-4042-affd-d5324d62bef4" containerName="dnsmasq-dns" containerID="cri-o://06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70" gracePeriod=10 Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.853936 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.992859 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:41:20 crc kubenswrapper[4834]: I1008 22:41:20.997249 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:20.999676 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.000198 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mlldf" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.000375 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.005655 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.012108 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015271 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015320 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015356 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " 
pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015496 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-scripts\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015592 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkm97\" (UniqueName: \"kubernetes.io/projected/00e05134-e159-40fe-9c63-a0dc406c8dee-kube-api-access-pkm97\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015692 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.015766 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-config\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119061 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119113 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119166 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119186 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-scripts\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119213 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkm97\" (UniqueName: \"kubernetes.io/projected/00e05134-e159-40fe-9c63-a0dc406c8dee-kube-api-access-pkm97\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.119280 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-config\") pod \"ovn-northd-0\" (UID: 
\"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.120071 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-scripts\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.120298 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-config\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.120723 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.124425 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.125013 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.132848 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.135547 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkm97\" (UniqueName: \"kubernetes.io/projected/00e05134-e159-40fe-9c63-a0dc406c8dee-kube-api-access-pkm97\") pod \"ovn-northd-0\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.316246 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.824397 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.831055 4834 generic.go:334] "Generic (PLEG): container finished" podID="78090383-28da-4042-affd-d5324d62bef4" containerID="06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70" exitCode=0 Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.831110 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" event={"ID":"78090383-28da-4042-affd-d5324d62bef4","Type":"ContainerDied","Data":"06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70"} Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.831160 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" event={"ID":"78090383-28da-4042-affd-d5324d62bef4","Type":"ContainerDied","Data":"89dcd9b6a76955084d5d80d8d2c1d6794b84b5bd44ce6ebf237d65b95694c5fc"} Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.831182 4834 scope.go:117] "RemoveContainer" containerID="06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70" Oct 08 22:41:21 crc 
kubenswrapper[4834]: I1008 22:41:21.867064 4834 scope.go:117] "RemoveContainer" containerID="fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.905813 4834 scope.go:117] "RemoveContainer" containerID="06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70" Oct 08 22:41:21 crc kubenswrapper[4834]: E1008 22:41:21.906327 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70\": container with ID starting with 06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70 not found: ID does not exist" containerID="06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.906390 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70"} err="failed to get container status \"06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70\": rpc error: code = NotFound desc = could not find container \"06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70\": container with ID starting with 06d616954cfd258b4079042f211a2cdef2a856cd5cae4fdece4b8637fd61dd70 not found: ID does not exist" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.906618 4834 scope.go:117] "RemoveContainer" containerID="fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4" Oct 08 22:41:21 crc kubenswrapper[4834]: E1008 22:41:21.907402 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4\": container with ID starting with fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4 not found: ID does not exist" 
containerID="fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.907441 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4"} err="failed to get container status \"fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4\": rpc error: code = NotFound desc = could not find container \"fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4\": container with ID starting with fd31af493cd7ccb9d63a55bae9509046bd9ac2763a82a6f3ebe712157523dfe4 not found: ID does not exist" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.935722 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlnjf\" (UniqueName: \"kubernetes.io/projected/78090383-28da-4042-affd-d5324d62bef4-kube-api-access-nlnjf\") pod \"78090383-28da-4042-affd-d5324d62bef4\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.935853 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-config\") pod \"78090383-28da-4042-affd-d5324d62bef4\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.935904 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-dns-svc\") pod \"78090383-28da-4042-affd-d5324d62bef4\" (UID: \"78090383-28da-4042-affd-d5324d62bef4\") " Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.947996 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78090383-28da-4042-affd-d5324d62bef4-kube-api-access-nlnjf" (OuterVolumeSpecName: "kube-api-access-nlnjf") pod 
"78090383-28da-4042-affd-d5324d62bef4" (UID: "78090383-28da-4042-affd-d5324d62bef4"). InnerVolumeSpecName "kube-api-access-nlnjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.979193 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-config" (OuterVolumeSpecName: "config") pod "78090383-28da-4042-affd-d5324d62bef4" (UID: "78090383-28da-4042-affd-d5324d62bef4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:21 crc kubenswrapper[4834]: I1008 22:41:21.983766 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78090383-28da-4042-affd-d5324d62bef4" (UID: "78090383-28da-4042-affd-d5324d62bef4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.038419 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.038471 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlnjf\" (UniqueName: \"kubernetes.io/projected/78090383-28da-4042-affd-d5324d62bef4-kube-api-access-nlnjf\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.038486 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78090383-28da-4042-affd-d5324d62bef4-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.107710 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:41:22 crc 
kubenswrapper[4834]: I1008 22:41:22.120302 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l24rp"] Oct 08 22:41:22 crc kubenswrapper[4834]: W1008 22:41:22.129683 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9087d728_8ea1_4f0c_aff6_7dae2fd139ec.slice/crio-7e8e0b124ac904f0c5eb8e8cff7c9358eec39f02813ce3062ef5855349622ca9 WatchSource:0}: Error finding container 7e8e0b124ac904f0c5eb8e8cff7c9358eec39f02813ce3062ef5855349622ca9: Status 404 returned error can't find the container with id 7e8e0b124ac904f0c5eb8e8cff7c9358eec39f02813ce3062ef5855349622ca9 Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.164756 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-5bxfv"] Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.235330 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fcb8575d9-sjqmx"] Oct 08 22:41:22 crc kubenswrapper[4834]: W1008 22:41:22.247282 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e0d0702_07d0_4291_9683_6edadee6d8d8.slice/crio-8b494a50d9027de081d3eecdca5a8eb8e7bba2b2f3d65856cfa8f67342a5cf73 WatchSource:0}: Error finding container 8b494a50d9027de081d3eecdca5a8eb8e7bba2b2f3d65856cfa8f67342a5cf73: Status 404 returned error can't find the container with id 8b494a50d9027de081d3eecdca5a8eb8e7bba2b2f3d65856cfa8f67342a5cf73 Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.845675 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l24rp" event={"ID":"9087d728-8ea1-4f0c-aff6-7dae2fd139ec","Type":"ContainerStarted","Data":"cf8764a03b4bf1c3a07b250cdaacaf2edfad20da4aa53f6789acb7d9ee72de4d"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.846589 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-l24rp" event={"ID":"9087d728-8ea1-4f0c-aff6-7dae2fd139ec","Type":"ContainerStarted","Data":"7e8e0b124ac904f0c5eb8e8cff7c9358eec39f02813ce3062ef5855349622ca9"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.850415 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00e05134-e159-40fe-9c63-a0dc406c8dee","Type":"ContainerStarted","Data":"ebc156307ae1e0f44c84bb59a1cd9048a78bdbb163f2d1a956d0ff4a051a33c8"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.853609 4834 generic.go:334] "Generic (PLEG): container finished" podID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerID="21726a9a88b68197f50bd3071c644dee6314754cdcf9f7b3c93e80adfa418676" exitCode=0 Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.853729 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" event={"ID":"361e1c9f-8765-484b-a2ef-4b2e3db1af99","Type":"ContainerDied","Data":"21726a9a88b68197f50bd3071c644dee6314754cdcf9f7b3c93e80adfa418676"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.853776 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" event={"ID":"361e1c9f-8765-484b-a2ef-4b2e3db1af99","Type":"ContainerStarted","Data":"aae7a6c13a1b60aea04e210d940817f4ad395c8f5fdb11d5304714178acf8442"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.856255 4834 generic.go:334] "Generic (PLEG): container finished" podID="9e0d0702-07d0-4291-9683-6edadee6d8d8" containerID="f6a3c4a1062b166ed39a481cf4f59bb19fc9d1d3d1fb4dc24ada31cd491d6178" exitCode=0 Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.856366 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" event={"ID":"9e0d0702-07d0-4291-9683-6edadee6d8d8","Type":"ContainerDied","Data":"f6a3c4a1062b166ed39a481cf4f59bb19fc9d1d3d1fb4dc24ada31cd491d6178"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 
22:41:22.856487 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" event={"ID":"9e0d0702-07d0-4291-9683-6edadee6d8d8","Type":"ContainerStarted","Data":"8b494a50d9027de081d3eecdca5a8eb8e7bba2b2f3d65856cfa8f67342a5cf73"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.860503 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2cn89" event={"ID":"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb","Type":"ContainerStarted","Data":"d868a94140ac15e6896c96857c31093ceddf5d4d44594dedeb5f93723c331d7d"} Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.873654 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57dc6dbc4c-jm42x" Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.917591 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2cn89" podStartSLOduration=2.124271937 podStartE2EDuration="5.917567553s" podCreationTimestamp="2025-10-08 22:41:17 +0000 UTC" firstStartedPulling="2025-10-08 22:41:17.865700958 +0000 UTC m=+1085.688585724" lastFinishedPulling="2025-10-08 22:41:21.658996584 +0000 UTC m=+1089.481881340" observedRunningTime="2025-10-08 22:41:22.901225185 +0000 UTC m=+1090.724109941" watchObservedRunningTime="2025-10-08 22:41:22.917567553 +0000 UTC m=+1090.740452289" Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.917701 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l24rp" podStartSLOduration=2.917695316 podStartE2EDuration="2.917695316s" podCreationTimestamp="2025-10-08 22:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:22.872208228 +0000 UTC m=+1090.695092984" watchObservedRunningTime="2025-10-08 22:41:22.917695316 +0000 UTC m=+1090.740580062" Oct 08 22:41:22 crc 
kubenswrapper[4834]: I1008 22:41:22.979043 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dc6dbc4c-jm42x"] Oct 08 22:41:22 crc kubenswrapper[4834]: I1008 22:41:22.988366 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57dc6dbc4c-jm42x"] Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.210458 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.212327 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.370305 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.370359 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.418376 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.602655 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78090383-28da-4042-affd-d5324d62bef4" path="/var/lib/kubelet/pods/78090383-28da-4042-affd-d5324d62bef4/volumes" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.645168 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.776277 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-config\") pod \"9e0d0702-07d0-4291-9683-6edadee6d8d8\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.776566 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-dns-svc\") pod \"9e0d0702-07d0-4291-9683-6edadee6d8d8\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.776751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-ovsdbserver-sb\") pod \"9e0d0702-07d0-4291-9683-6edadee6d8d8\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.776853 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-698pm\" (UniqueName: \"kubernetes.io/projected/9e0d0702-07d0-4291-9683-6edadee6d8d8-kube-api-access-698pm\") pod \"9e0d0702-07d0-4291-9683-6edadee6d8d8\" (UID: \"9e0d0702-07d0-4291-9683-6edadee6d8d8\") " Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.781885 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0d0702-07d0-4291-9683-6edadee6d8d8-kube-api-access-698pm" (OuterVolumeSpecName: "kube-api-access-698pm") pod "9e0d0702-07d0-4291-9683-6edadee6d8d8" (UID: "9e0d0702-07d0-4291-9683-6edadee6d8d8"). InnerVolumeSpecName "kube-api-access-698pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.809094 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-config" (OuterVolumeSpecName: "config") pod "9e0d0702-07d0-4291-9683-6edadee6d8d8" (UID: "9e0d0702-07d0-4291-9683-6edadee6d8d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.809101 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e0d0702-07d0-4291-9683-6edadee6d8d8" (UID: "9e0d0702-07d0-4291-9683-6edadee6d8d8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.819619 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e0d0702-07d0-4291-9683-6edadee6d8d8" (UID: "9e0d0702-07d0-4291-9683-6edadee6d8d8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.879987 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.880019 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.880027 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e0d0702-07d0-4291-9683-6edadee6d8d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.880036 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-698pm\" (UniqueName: \"kubernetes.io/projected/9e0d0702-07d0-4291-9683-6edadee6d8d8-kube-api-access-698pm\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.883268 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00e05134-e159-40fe-9c63-a0dc406c8dee","Type":"ContainerStarted","Data":"befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7"} Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.886201 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" event={"ID":"361e1c9f-8765-484b-a2ef-4b2e3db1af99","Type":"ContainerStarted","Data":"f0b6250ea0aa62689a7cecc3a1672706ca1cd0027b1e3877b2292f8da2d8ab95"} Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.886386 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.889013 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" event={"ID":"9e0d0702-07d0-4291-9683-6edadee6d8d8","Type":"ContainerDied","Data":"8b494a50d9027de081d3eecdca5a8eb8e7bba2b2f3d65856cfa8f67342a5cf73"} Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.889048 4834 scope.go:117] "RemoveContainer" containerID="f6a3c4a1062b166ed39a481cf4f59bb19fc9d1d3d1fb4dc24ada31cd491d6178" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.889424 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fcb8575d9-sjqmx" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.920668 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" podStartSLOduration=3.9206434 podStartE2EDuration="3.9206434s" podCreationTimestamp="2025-10-08 22:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:23.920351843 +0000 UTC m=+1091.743236599" watchObservedRunningTime="2025-10-08 22:41:23.9206434 +0000 UTC m=+1091.743528146" Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.960433 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fcb8575d9-sjqmx"] Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.965026 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fcb8575d9-sjqmx"] Oct 08 22:41:23 crc kubenswrapper[4834]: I1008 22:41:23.973509 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 08 22:41:24 crc kubenswrapper[4834]: I1008 22:41:24.490937 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " 
pod="openstack/swift-storage-0" Oct 08 22:41:24 crc kubenswrapper[4834]: E1008 22:41:24.491733 4834 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 08 22:41:24 crc kubenswrapper[4834]: E1008 22:41:24.491886 4834 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 08 22:41:24 crc kubenswrapper[4834]: E1008 22:41:24.491950 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift podName:6ab95611-95ff-46bf-9b06-2ed44a58fa46 nodeName:}" failed. No retries permitted until 2025-10-08 22:41:32.491929402 +0000 UTC m=+1100.314814178 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift") pod "swift-storage-0" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46") : configmap "swift-ring-files" not found Oct 08 22:41:24 crc kubenswrapper[4834]: I1008 22:41:24.902421 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00e05134-e159-40fe-9c63-a0dc406c8dee","Type":"ContainerStarted","Data":"ee6702ece47fd3dad3f711249016d49520a142737dfe65f64635bcd1579089db"} Oct 08 22:41:24 crc kubenswrapper[4834]: I1008 22:41:24.902950 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 08 22:41:24 crc kubenswrapper[4834]: I1008 22:41:24.927825 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.5443120070000003 podStartE2EDuration="4.927810838s" podCreationTimestamp="2025-10-08 22:41:20 +0000 UTC" firstStartedPulling="2025-10-08 22:41:22.120557335 +0000 UTC m=+1089.943442081" lastFinishedPulling="2025-10-08 22:41:23.504056166 +0000 UTC m=+1091.326940912" observedRunningTime="2025-10-08 
22:41:24.924985229 +0000 UTC m=+1092.747869985" watchObservedRunningTime="2025-10-08 22:41:24.927810838 +0000 UTC m=+1092.750695594" Oct 08 22:41:25 crc kubenswrapper[4834]: I1008 22:41:25.344228 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 08 22:41:25 crc kubenswrapper[4834]: I1008 22:41:25.395089 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 22:41:25 crc kubenswrapper[4834]: I1008 22:41:25.424015 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 08 22:41:25 crc kubenswrapper[4834]: I1008 22:41:25.566783 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0d0702-07d0-4291-9683-6edadee6d8d8" path="/var/lib/kubelet/pods/9e0d0702-07d0-4291-9683-6edadee6d8d8/volumes" Oct 08 22:41:28 crc kubenswrapper[4834]: I1008 22:41:28.947309 4834 generic.go:334] "Generic (PLEG): container finished" podID="bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" containerID="d868a94140ac15e6896c96857c31093ceddf5d4d44594dedeb5f93723c331d7d" exitCode=0 Oct 08 22:41:28 crc kubenswrapper[4834]: I1008 22:41:28.947461 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2cn89" event={"ID":"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb","Type":"ContainerDied","Data":"d868a94140ac15e6896c96857c31093ceddf5d4d44594dedeb5f93723c331d7d"} Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.044994 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-r88vk"] Oct 08 22:41:29 crc kubenswrapper[4834]: E1008 22:41:29.045605 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78090383-28da-4042-affd-d5324d62bef4" containerName="init" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.045648 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="78090383-28da-4042-affd-d5324d62bef4" containerName="init" 
Oct 08 22:41:29 crc kubenswrapper[4834]: E1008 22:41:29.045682 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0d0702-07d0-4291-9683-6edadee6d8d8" containerName="init" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.045701 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0d0702-07d0-4291-9683-6edadee6d8d8" containerName="init" Oct 08 22:41:29 crc kubenswrapper[4834]: E1008 22:41:29.045745 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78090383-28da-4042-affd-d5324d62bef4" containerName="dnsmasq-dns" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.045765 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="78090383-28da-4042-affd-d5324d62bef4" containerName="dnsmasq-dns" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.046119 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="78090383-28da-4042-affd-d5324d62bef4" containerName="dnsmasq-dns" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.046207 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e0d0702-07d0-4291-9683-6edadee6d8d8" containerName="init" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.047101 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-r88vk" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.056453 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-r88vk"] Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.186701 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5dh\" (UniqueName: \"kubernetes.io/projected/ff1278e3-01d3-4a69-9689-caafc578bbb0-kube-api-access-kp5dh\") pod \"glance-db-create-r88vk\" (UID: \"ff1278e3-01d3-4a69-9689-caafc578bbb0\") " pod="openstack/glance-db-create-r88vk" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.289364 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5dh\" (UniqueName: \"kubernetes.io/projected/ff1278e3-01d3-4a69-9689-caafc578bbb0-kube-api-access-kp5dh\") pod \"glance-db-create-r88vk\" (UID: \"ff1278e3-01d3-4a69-9689-caafc578bbb0\") " pod="openstack/glance-db-create-r88vk" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.324265 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5dh\" (UniqueName: \"kubernetes.io/projected/ff1278e3-01d3-4a69-9689-caafc578bbb0-kube-api-access-kp5dh\") pod \"glance-db-create-r88vk\" (UID: \"ff1278e3-01d3-4a69-9689-caafc578bbb0\") " pod="openstack/glance-db-create-r88vk" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.378483 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-r88vk" Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.912839 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-r88vk"] Oct 08 22:41:29 crc kubenswrapper[4834]: W1008 22:41:29.917453 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1278e3_01d3_4a69_9689_caafc578bbb0.slice/crio-83b88b0014894bfe7af5dca15030b3f22565dee8c6cad5b5543f925566a1e4cc WatchSource:0}: Error finding container 83b88b0014894bfe7af5dca15030b3f22565dee8c6cad5b5543f925566a1e4cc: Status 404 returned error can't find the container with id 83b88b0014894bfe7af5dca15030b3f22565dee8c6cad5b5543f925566a1e4cc Oct 08 22:41:29 crc kubenswrapper[4834]: I1008 22:41:29.960314 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r88vk" event={"ID":"ff1278e3-01d3-4a69-9689-caafc578bbb0","Type":"ContainerStarted","Data":"83b88b0014894bfe7af5dca15030b3f22565dee8c6cad5b5543f925566a1e4cc"} Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.311461 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:30 crc kubenswrapper[4834]: E1008 22:41:30.404355 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1278e3_01d3_4a69_9689_caafc578bbb0.slice/crio-be441ca6796f4a769b5bac5d97b9e63bade7cf01a581a51b7711992d0886f68a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1278e3_01d3_4a69_9689_caafc578bbb0.slice/crio-conmon-be441ca6796f4a769b5bac5d97b9e63bade7cf01a581a51b7711992d0886f68a.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413069 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-combined-ca-bundle\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413156 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-scripts\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413199 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-swiftconf\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413379 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6ncv\" (UniqueName: 
\"kubernetes.io/projected/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-kube-api-access-l6ncv\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413413 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-etc-swift\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413470 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-ring-data-devices\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.413504 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-dispersionconf\") pod \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\" (UID: \"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb\") " Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.414298 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.415309 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.418304 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-kube-api-access-l6ncv" (OuterVolumeSpecName: "kube-api-access-l6ncv") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "kube-api-access-l6ncv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.421973 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.437483 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.440563 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-scripts" (OuterVolumeSpecName: "scripts") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.450452 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" (UID: "bedf43f2-17b7-462d-8ca1-41d4dae1e6cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.515270 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.515577 4834 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.515587 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6ncv\" (UniqueName: \"kubernetes.io/projected/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-kube-api-access-l6ncv\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.515597 4834 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc 
kubenswrapper[4834]: I1008 22:41:30.515607 4834 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.515615 4834 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.515622 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.795480 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.883804 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jwkgj"] Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.884106 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77597f887-jwkgj" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerName="dnsmasq-dns" containerID="cri-o://bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e" gracePeriod=10 Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.971699 4834 generic.go:334] "Generic (PLEG): container finished" podID="ff1278e3-01d3-4a69-9689-caafc578bbb0" containerID="be441ca6796f4a769b5bac5d97b9e63bade7cf01a581a51b7711992d0886f68a" exitCode=0 Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.972188 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r88vk" 
event={"ID":"ff1278e3-01d3-4a69-9689-caafc578bbb0","Type":"ContainerDied","Data":"be441ca6796f4a769b5bac5d97b9e63bade7cf01a581a51b7711992d0886f68a"} Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.977938 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2cn89" event={"ID":"bedf43f2-17b7-462d-8ca1-41d4dae1e6cb","Type":"ContainerDied","Data":"ded69cf5b3f39635560c0af743553280f753d8d4ba461c3759bada094e6bcd55"} Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.978000 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ded69cf5b3f39635560c0af743553280f753d8d4ba461c3759bada094e6bcd55" Oct 08 22:41:30 crc kubenswrapper[4834]: I1008 22:41:30.978126 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2cn89" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.296824 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jwkgj" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.337612 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-config\") pod \"27ef60d6-9b76-4b61-9c92-75fa394546a0\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.338029 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zvv9\" (UniqueName: \"kubernetes.io/projected/27ef60d6-9b76-4b61-9c92-75fa394546a0-kube-api-access-9zvv9\") pod \"27ef60d6-9b76-4b61-9c92-75fa394546a0\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.338055 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-dns-svc\") pod \"27ef60d6-9b76-4b61-9c92-75fa394546a0\" (UID: \"27ef60d6-9b76-4b61-9c92-75fa394546a0\") " Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.343832 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ef60d6-9b76-4b61-9c92-75fa394546a0-kube-api-access-9zvv9" (OuterVolumeSpecName: "kube-api-access-9zvv9") pod "27ef60d6-9b76-4b61-9c92-75fa394546a0" (UID: "27ef60d6-9b76-4b61-9c92-75fa394546a0"). InnerVolumeSpecName "kube-api-access-9zvv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.391759 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-config" (OuterVolumeSpecName: "config") pod "27ef60d6-9b76-4b61-9c92-75fa394546a0" (UID: "27ef60d6-9b76-4b61-9c92-75fa394546a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.395072 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27ef60d6-9b76-4b61-9c92-75fa394546a0" (UID: "27ef60d6-9b76-4b61-9c92-75fa394546a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.439457 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zvv9\" (UniqueName: \"kubernetes.io/projected/27ef60d6-9b76-4b61-9c92-75fa394546a0-kube-api-access-9zvv9\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.439488 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.439499 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27ef60d6-9b76-4b61-9c92-75fa394546a0-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.991353 4834 generic.go:334] "Generic (PLEG): container finished" podID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerID="bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e" exitCode=0 Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.991436 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-jwkgj" Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.991448 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jwkgj" event={"ID":"27ef60d6-9b76-4b61-9c92-75fa394546a0","Type":"ContainerDied","Data":"bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e"} Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.991532 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-jwkgj" event={"ID":"27ef60d6-9b76-4b61-9c92-75fa394546a0","Type":"ContainerDied","Data":"d29cabfdcec3c05e6ad3f1bd1c705f5a11e2fd028f36fd84f0de66d0a2b2cdd7"} Oct 08 22:41:31 crc kubenswrapper[4834]: I1008 22:41:31.991616 4834 scope.go:117] "RemoveContainer" containerID="bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.022662 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jwkgj"] Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.034650 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77597f887-jwkgj"] Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.035315 4834 scope.go:117] "RemoveContainer" containerID="529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.064743 4834 scope.go:117] "RemoveContainer" containerID="bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e" Oct 08 22:41:32 crc kubenswrapper[4834]: E1008 22:41:32.070472 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e\": container with ID starting with bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e not found: ID does not exist" 
containerID="bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.070531 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e"} err="failed to get container status \"bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e\": rpc error: code = NotFound desc = could not find container \"bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e\": container with ID starting with bf11ea7d1a1c554369b6a950592e53ca84eda39ef7853114ead35df7e1faa23e not found: ID does not exist" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.070573 4834 scope.go:117] "RemoveContainer" containerID="529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c" Oct 08 22:41:32 crc kubenswrapper[4834]: E1008 22:41:32.073688 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c\": container with ID starting with 529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c not found: ID does not exist" containerID="529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.073729 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c"} err="failed to get container status \"529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c\": rpc error: code = NotFound desc = could not find container \"529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c\": container with ID starting with 529774511ef98ce5b8395344023e355c79d0748a33f2bb65da07aeb897840f7c not found: ID does not exist" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.318114 4834 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-r88vk" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.358910 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp5dh\" (UniqueName: \"kubernetes.io/projected/ff1278e3-01d3-4a69-9689-caafc578bbb0-kube-api-access-kp5dh\") pod \"ff1278e3-01d3-4a69-9689-caafc578bbb0\" (UID: \"ff1278e3-01d3-4a69-9689-caafc578bbb0\") " Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.363154 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1278e3-01d3-4a69-9689-caafc578bbb0-kube-api-access-kp5dh" (OuterVolumeSpecName: "kube-api-access-kp5dh") pod "ff1278e3-01d3-4a69-9689-caafc578bbb0" (UID: "ff1278e3-01d3-4a69-9689-caafc578bbb0"). InnerVolumeSpecName "kube-api-access-kp5dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.460594 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp5dh\" (UniqueName: \"kubernetes.io/projected/ff1278e3-01d3-4a69-9689-caafc578bbb0-kube-api-access-kp5dh\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.561908 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.569901 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"swift-storage-0\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " pod="openstack/swift-storage-0" Oct 08 22:41:32 crc kubenswrapper[4834]: I1008 22:41:32.695892 4834 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.001032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-r88vk" event={"ID":"ff1278e3-01d3-4a69-9689-caafc578bbb0","Type":"ContainerDied","Data":"83b88b0014894bfe7af5dca15030b3f22565dee8c6cad5b5543f925566a1e4cc"} Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.001304 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b88b0014894bfe7af5dca15030b3f22565dee8c6cad5b5543f925566a1e4cc" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.001065 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-r88vk" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.003123 4834 generic.go:334] "Generic (PLEG): container finished" podID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerID="3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721" exitCode=0 Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.003185 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"08a7721f-38a1-4a82-88ed-6f70290b5a6d","Type":"ContainerDied","Data":"3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721"} Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.317257 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.360615 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zrv7r"] Oct 08 22:41:33 crc kubenswrapper[4834]: E1008 22:41:33.361209 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" containerName="swift-ring-rebalance" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361231 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" containerName="swift-ring-rebalance" Oct 08 22:41:33 crc kubenswrapper[4834]: E1008 22:41:33.361247 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1278e3-01d3-4a69-9689-caafc578bbb0" containerName="mariadb-database-create" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361256 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1278e3-01d3-4a69-9689-caafc578bbb0" containerName="mariadb-database-create" Oct 08 22:41:33 crc kubenswrapper[4834]: E1008 22:41:33.361270 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerName="dnsmasq-dns" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361278 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerName="dnsmasq-dns" Oct 08 22:41:33 crc kubenswrapper[4834]: E1008 22:41:33.361298 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerName="init" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361306 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerName="init" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361626 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1278e3-01d3-4a69-9689-caafc578bbb0" containerName="mariadb-database-create" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361654 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" containerName="dnsmasq-dns" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.361675 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" containerName="swift-ring-rebalance" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.362506 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.375601 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zrv7r"] Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.478454 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2tz\" (UniqueName: \"kubernetes.io/projected/500dac52-9a64-4864-bb56-dddfe8b82e88-kube-api-access-km2tz\") pod \"keystone-db-create-zrv7r\" (UID: \"500dac52-9a64-4864-bb56-dddfe8b82e88\") " pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.567571 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ef60d6-9b76-4b61-9c92-75fa394546a0" path="/var/lib/kubelet/pods/27ef60d6-9b76-4b61-9c92-75fa394546a0/volumes" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.580433 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2tz\" (UniqueName: \"kubernetes.io/projected/500dac52-9a64-4864-bb56-dddfe8b82e88-kube-api-access-km2tz\") pod \"keystone-db-create-zrv7r\" (UID: \"500dac52-9a64-4864-bb56-dddfe8b82e88\") " pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.603763 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2tz\" (UniqueName: \"kubernetes.io/projected/500dac52-9a64-4864-bb56-dddfe8b82e88-kube-api-access-km2tz\") pod \"keystone-db-create-zrv7r\" (UID: \"500dac52-9a64-4864-bb56-dddfe8b82e88\") " pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.660310 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gzmg7"] Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.661568 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.669679 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gzmg7"] Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.699335 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.784668 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fznjg\" (UniqueName: \"kubernetes.io/projected/60bddd4c-dfc8-401c-8e6c-026cebd5703c-kube-api-access-fznjg\") pod \"placement-db-create-gzmg7\" (UID: \"60bddd4c-dfc8-401c-8e6c-026cebd5703c\") " pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.886287 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fznjg\" (UniqueName: \"kubernetes.io/projected/60bddd4c-dfc8-401c-8e6c-026cebd5703c-kube-api-access-fznjg\") pod \"placement-db-create-gzmg7\" (UID: \"60bddd4c-dfc8-401c-8e6c-026cebd5703c\") " pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.905312 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fznjg\" (UniqueName: \"kubernetes.io/projected/60bddd4c-dfc8-401c-8e6c-026cebd5703c-kube-api-access-fznjg\") pod \"placement-db-create-gzmg7\" (UID: \"60bddd4c-dfc8-401c-8e6c-026cebd5703c\") " pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:33 crc kubenswrapper[4834]: I1008 22:41:33.981057 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.016285 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"08a7721f-38a1-4a82-88ed-6f70290b5a6d","Type":"ContainerStarted","Data":"d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b"} Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.016572 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.020167 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"4a6704ede3b4e4fe0891a2208f043b040a53ddfdb0c9c606a38792df9ed39863"} Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.026719 4834 generic.go:334] "Generic (PLEG): container finished" podID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerID="f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee" exitCode=0 Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.026761 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9809d14f-10d2-479f-94d9-5b3ae7f49e7b","Type":"ContainerDied","Data":"f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee"} Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.048850 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.332936894 podStartE2EDuration="56.048828757s" podCreationTimestamp="2025-10-08 22:40:38 +0000 UTC" firstStartedPulling="2025-10-08 22:40:44.902081275 +0000 UTC m=+1052.724966021" lastFinishedPulling="2025-10-08 22:40:57.617973128 +0000 UTC m=+1065.440857884" observedRunningTime="2025-10-08 22:41:34.038362141 +0000 UTC m=+1101.861246887" watchObservedRunningTime="2025-10-08 22:41:34.048828757 +0000 UTC 
m=+1101.871713503" Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.124572 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zrv7r"] Oct 08 22:41:34 crc kubenswrapper[4834]: W1008 22:41:34.249917 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod500dac52_9a64_4864_bb56_dddfe8b82e88.slice/crio-d35919ca44a521658d5219df79867894c0f43d40b05c9a91186c80e937728f66 WatchSource:0}: Error finding container d35919ca44a521658d5219df79867894c0f43d40b05c9a91186c80e937728f66: Status 404 returned error can't find the container with id d35919ca44a521658d5219df79867894c0f43d40b05c9a91186c80e937728f66 Oct 08 22:41:34 crc kubenswrapper[4834]: I1008 22:41:34.768055 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gzmg7"] Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.034627 4834 generic.go:334] "Generic (PLEG): container finished" podID="60bddd4c-dfc8-401c-8e6c-026cebd5703c" containerID="af52d1bc1732e50cef0ae70a87734710490b056806da58f0bfa53ce4b2072fa4" exitCode=0 Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.034698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gzmg7" event={"ID":"60bddd4c-dfc8-401c-8e6c-026cebd5703c","Type":"ContainerDied","Data":"af52d1bc1732e50cef0ae70a87734710490b056806da58f0bfa53ce4b2072fa4"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.034742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gzmg7" event={"ID":"60bddd4c-dfc8-401c-8e6c-026cebd5703c","Type":"ContainerStarted","Data":"2656e618ea1c62623402f6d9cbddd5de2e1c9227bbd2ae0d9ea03a63b9eaf406"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.044974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"7514b6a8989d60bf273471f9730c93e60d545194b6408566793dc09f41f4ed2c"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.045042 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"06fd06ac9cbc7b6f70094375f8d6b9bf90833b141ddadf2cad95017b587d05a9"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.046774 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9809d14f-10d2-479f-94d9-5b3ae7f49e7b","Type":"ContainerStarted","Data":"5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.047051 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.049514 4834 generic.go:334] "Generic (PLEG): container finished" podID="500dac52-9a64-4864-bb56-dddfe8b82e88" containerID="dd65c7ce1dc2a6b52c253c3d1d5fdcb37949e830b5563bff431e26264e98a133" exitCode=0 Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.049644 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zrv7r" event={"ID":"500dac52-9a64-4864-bb56-dddfe8b82e88","Type":"ContainerDied","Data":"dd65c7ce1dc2a6b52c253c3d1d5fdcb37949e830b5563bff431e26264e98a133"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.049694 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zrv7r" event={"ID":"500dac52-9a64-4864-bb56-dddfe8b82e88","Type":"ContainerStarted","Data":"d35919ca44a521658d5219df79867894c0f43d40b05c9a91186c80e937728f66"} Oct 08 22:41:35 crc kubenswrapper[4834]: I1008 22:41:35.087267 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=56.087118531 podStartE2EDuration="56.087118531s" podCreationTimestamp="2025-10-08 22:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:41:35.083417961 +0000 UTC m=+1102.906302727" watchObservedRunningTime="2025-10-08 22:41:35.087118531 +0000 UTC m=+1102.910003267" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.058045 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"32976af02fd539929cf231bf96ee9923fbea70a134e86a660abf29e813f31e4c"} Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.058436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"5523f6432e877d33c1dd624b005e3a10c13f5c90d768a9ec5b33eb492f9cd80b"} Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.391937 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.526123 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.529277 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.660198 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fznjg\" (UniqueName: \"kubernetes.io/projected/60bddd4c-dfc8-401c-8e6c-026cebd5703c-kube-api-access-fznjg\") pod \"60bddd4c-dfc8-401c-8e6c-026cebd5703c\" (UID: \"60bddd4c-dfc8-401c-8e6c-026cebd5703c\") " Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.660282 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km2tz\" (UniqueName: \"kubernetes.io/projected/500dac52-9a64-4864-bb56-dddfe8b82e88-kube-api-access-km2tz\") pod \"500dac52-9a64-4864-bb56-dddfe8b82e88\" (UID: \"500dac52-9a64-4864-bb56-dddfe8b82e88\") " Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.670080 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60bddd4c-dfc8-401c-8e6c-026cebd5703c-kube-api-access-fznjg" (OuterVolumeSpecName: "kube-api-access-fznjg") pod "60bddd4c-dfc8-401c-8e6c-026cebd5703c" (UID: "60bddd4c-dfc8-401c-8e6c-026cebd5703c"). InnerVolumeSpecName "kube-api-access-fznjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.670185 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500dac52-9a64-4864-bb56-dddfe8b82e88-kube-api-access-km2tz" (OuterVolumeSpecName: "kube-api-access-km2tz") pod "500dac52-9a64-4864-bb56-dddfe8b82e88" (UID: "500dac52-9a64-4864-bb56-dddfe8b82e88"). InnerVolumeSpecName "kube-api-access-km2tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.762071 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fznjg\" (UniqueName: \"kubernetes.io/projected/60bddd4c-dfc8-401c-8e6c-026cebd5703c-kube-api-access-fznjg\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:36 crc kubenswrapper[4834]: I1008 22:41:36.762544 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km2tz\" (UniqueName: \"kubernetes.io/projected/500dac52-9a64-4864-bb56-dddfe8b82e88-kube-api-access-km2tz\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.065220 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gzmg7" Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.065232 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gzmg7" event={"ID":"60bddd4c-dfc8-401c-8e6c-026cebd5703c","Type":"ContainerDied","Data":"2656e618ea1c62623402f6d9cbddd5de2e1c9227bbd2ae0d9ea03a63b9eaf406"} Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.065284 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2656e618ea1c62623402f6d9cbddd5de2e1c9227bbd2ae0d9ea03a63b9eaf406" Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.073174 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"f652c6fc5992ba73c5161286567da0fcc0cd2540cfc9653f9d3e00b5ca106caa"} Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.073217 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"6ce010fd5f1a7322ee19e0d85fe1859b18c483f1e6cee129d2324a72aca8c9ae"} Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 
22:41:37.073229 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"2091dbf498187923ad07111022a998400d80f39064fb32a84e0b373a124b3d4d"} Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.073240 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"d8010142af61d9a4bcf191642a46ef203b560507fdc751d54f09b14df4705c8f"} Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.077248 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zrv7r" event={"ID":"500dac52-9a64-4864-bb56-dddfe8b82e88","Type":"ContainerDied","Data":"d35919ca44a521658d5219df79867894c0f43d40b05c9a91186c80e937728f66"} Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.077279 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d35919ca44a521658d5219df79867894c0f43d40b05c9a91186c80e937728f66" Oct 08 22:41:37 crc kubenswrapper[4834]: I1008 22:41:37.077335 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zrv7r" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.088558 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9716-account-create-cjmxg"] Oct 08 22:41:39 crc kubenswrapper[4834]: E1008 22:41:39.089558 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500dac52-9a64-4864-bb56-dddfe8b82e88" containerName="mariadb-database-create" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.089589 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="500dac52-9a64-4864-bb56-dddfe8b82e88" containerName="mariadb-database-create" Oct 08 22:41:39 crc kubenswrapper[4834]: E1008 22:41:39.089612 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bddd4c-dfc8-401c-8e6c-026cebd5703c" containerName="mariadb-database-create" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.089624 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bddd4c-dfc8-401c-8e6c-026cebd5703c" containerName="mariadb-database-create" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.089932 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="500dac52-9a64-4864-bb56-dddfe8b82e88" containerName="mariadb-database-create" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.089984 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bddd4c-dfc8-401c-8e6c-026cebd5703c" containerName="mariadb-database-create" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.090886 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.094511 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.115286 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9716-account-create-cjmxg"] Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.206783 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4925z\" (UniqueName: \"kubernetes.io/projected/739f746e-d763-46b9-9512-1c8dde821ada-kube-api-access-4925z\") pod \"glance-9716-account-create-cjmxg\" (UID: \"739f746e-d763-46b9-9512-1c8dde821ada\") " pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.308689 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4925z\" (UniqueName: \"kubernetes.io/projected/739f746e-d763-46b9-9512-1c8dde821ada-kube-api-access-4925z\") pod \"glance-9716-account-create-cjmxg\" (UID: \"739f746e-d763-46b9-9512-1c8dde821ada\") " pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.344934 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4925z\" (UniqueName: \"kubernetes.io/projected/739f746e-d763-46b9-9512-1c8dde821ada-kube-api-access-4925z\") pod \"glance-9716-account-create-cjmxg\" (UID: \"739f746e-d763-46b9-9512-1c8dde821ada\") " pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:39 crc kubenswrapper[4834]: I1008 22:41:39.423848 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:42 crc kubenswrapper[4834]: I1008 22:41:42.303573 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9716-account-create-cjmxg"] Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.508378 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2c76-account-create-q2px5"] Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.510595 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.513080 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.521715 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2c76-account-create-q2px5"] Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.687545 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn72b\" (UniqueName: \"kubernetes.io/projected/5efb48ec-b3dd-4646-b813-007d45c94ad2-kube-api-access-xn72b\") pod \"keystone-2c76-account-create-q2px5\" (UID: \"5efb48ec-b3dd-4646-b813-007d45c94ad2\") " pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.700755 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3088-account-create-r9mqq"] Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.702490 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.706118 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.719560 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3088-account-create-r9mqq"] Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.789229 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q828\" (UniqueName: \"kubernetes.io/projected/b9226bc4-6b16-45cd-a31a-163ad9b5aa53-kube-api-access-9q828\") pod \"placement-3088-account-create-r9mqq\" (UID: \"b9226bc4-6b16-45cd-a31a-163ad9b5aa53\") " pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.789322 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn72b\" (UniqueName: \"kubernetes.io/projected/5efb48ec-b3dd-4646-b813-007d45c94ad2-kube-api-access-xn72b\") pod \"keystone-2c76-account-create-q2px5\" (UID: \"5efb48ec-b3dd-4646-b813-007d45c94ad2\") " pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.811738 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn72b\" (UniqueName: \"kubernetes.io/projected/5efb48ec-b3dd-4646-b813-007d45c94ad2-kube-api-access-xn72b\") pod \"keystone-2c76-account-create-q2px5\" (UID: \"5efb48ec-b3dd-4646-b813-007d45c94ad2\") " pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.845515 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.891091 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q828\" (UniqueName: \"kubernetes.io/projected/b9226bc4-6b16-45cd-a31a-163ad9b5aa53-kube-api-access-9q828\") pod \"placement-3088-account-create-r9mqq\" (UID: \"b9226bc4-6b16-45cd-a31a-163ad9b5aa53\") " pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:43 crc kubenswrapper[4834]: I1008 22:41:43.907987 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q828\" (UniqueName: \"kubernetes.io/projected/b9226bc4-6b16-45cd-a31a-163ad9b5aa53-kube-api-access-9q828\") pod \"placement-3088-account-create-r9mqq\" (UID: \"b9226bc4-6b16-45cd-a31a-163ad9b5aa53\") " pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.017868 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.176885 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9716-account-create-cjmxg" event={"ID":"739f746e-d763-46b9-9512-1c8dde821ada","Type":"ContainerStarted","Data":"2859b1a44f5eef71fc90927880e72a4636d22a5f334fff49465eab47f062e0e3"} Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.241810 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.253539 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.272569 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7bm6j" podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" probeResult="failure" output=< Oct 08 22:41:44 crc kubenswrapper[4834]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 08 22:41:44 crc kubenswrapper[4834]: > Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.377790 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2c76-account-create-q2px5"] Oct 08 22:41:44 crc kubenswrapper[4834]: W1008 22:41:44.382304 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5efb48ec_b3dd_4646_b813_007d45c94ad2.slice/crio-5d4313e7b2b7fb76ce7b1f25b3cd498c26927a9ab168808cedbc5591d5881a80 WatchSource:0}: Error finding container 5d4313e7b2b7fb76ce7b1f25b3cd498c26927a9ab168808cedbc5591d5881a80: Status 404 returned error can't find the container with id 5d4313e7b2b7fb76ce7b1f25b3cd498c26927a9ab168808cedbc5591d5881a80 Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.497412 4834 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/ovn-controller-7bm6j-config-pd5w6"] Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.503931 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.508067 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.556201 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7bm6j-config-pd5w6"] Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.582349 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3088-account-create-r9mqq"] Oct 08 22:41:44 crc kubenswrapper[4834]: W1008 22:41:44.582910 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9226bc4_6b16_45cd_a31a_163ad9b5aa53.slice/crio-88b9f098136b815ac74b459e9d9411e1bc990255241ef4dfae27bfe8ec1d2d1a WatchSource:0}: Error finding container 88b9f098136b815ac74b459e9d9411e1bc990255241ef4dfae27bfe8ec1d2d1a: Status 404 returned error can't find the container with id 88b9f098136b815ac74b459e9d9411e1bc990255241ef4dfae27bfe8ec1d2d1a Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.603425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run-ovn\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.603480 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-log-ovn\") 
pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.603546 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-additional-scripts\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.603582 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-scripts\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.603604 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrmt6\" (UniqueName: \"kubernetes.io/projected/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-kube-api-access-zrmt6\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.603658 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.705395 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.705498 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run-ovn\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.705530 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-log-ovn\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.705586 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-additional-scripts\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.705624 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-scripts\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.705645 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrmt6\" (UniqueName: 
\"kubernetes.io/projected/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-kube-api-access-zrmt6\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.706258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.706318 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run-ovn\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.706361 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-log-ovn\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.707092 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-additional-scripts\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.713154 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-scripts\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.738532 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrmt6\" (UniqueName: \"kubernetes.io/projected/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-kube-api-access-zrmt6\") pod \"ovn-controller-7bm6j-config-pd5w6\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:44 crc kubenswrapper[4834]: I1008 22:41:44.835615 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.187022 4834 generic.go:334] "Generic (PLEG): container finished" podID="739f746e-d763-46b9-9512-1c8dde821ada" containerID="0c9ac7a53523f40364b8d33dc72f9c865f268d08e573b6598312922e667c5174" exitCode=0 Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.187408 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9716-account-create-cjmxg" event={"ID":"739f746e-d763-46b9-9512-1c8dde821ada","Type":"ContainerDied","Data":"0c9ac7a53523f40364b8d33dc72f9c865f268d08e573b6598312922e667c5174"} Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.199222 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"3f5e1b6ecf4bdb7095f6e4f54237dd8d5e0fce913b21ceb7e2e9d3bbe4da4702"} Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.199254 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"916771be477f120fd96b5a1f5443d68a2ee7c86b4f15262feb8c7da880418d12"} Oct 08 
22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.205921 4834 generic.go:334] "Generic (PLEG): container finished" podID="5efb48ec-b3dd-4646-b813-007d45c94ad2" containerID="251b7ba4f4bb3ba2c276ea693fd961db641b65c3db7647c9aed9146f4376a84b" exitCode=0 Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.206125 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2c76-account-create-q2px5" event={"ID":"5efb48ec-b3dd-4646-b813-007d45c94ad2","Type":"ContainerDied","Data":"251b7ba4f4bb3ba2c276ea693fd961db641b65c3db7647c9aed9146f4376a84b"} Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.206381 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2c76-account-create-q2px5" event={"ID":"5efb48ec-b3dd-4646-b813-007d45c94ad2","Type":"ContainerStarted","Data":"5d4313e7b2b7fb76ce7b1f25b3cd498c26927a9ab168808cedbc5591d5881a80"} Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.211707 4834 generic.go:334] "Generic (PLEG): container finished" podID="b9226bc4-6b16-45cd-a31a-163ad9b5aa53" containerID="cfae18dfe7163f5270b1b9a333bf53a39b350a54593ea7ad6ea84c4a926bd51d" exitCode=0 Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.212943 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3088-account-create-r9mqq" event={"ID":"b9226bc4-6b16-45cd-a31a-163ad9b5aa53","Type":"ContainerDied","Data":"cfae18dfe7163f5270b1b9a333bf53a39b350a54593ea7ad6ea84c4a926bd51d"} Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.212983 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3088-account-create-r9mqq" event={"ID":"b9226bc4-6b16-45cd-a31a-163ad9b5aa53","Type":"ContainerStarted","Data":"88b9f098136b815ac74b459e9d9411e1bc990255241ef4dfae27bfe8ec1d2d1a"} Oct 08 22:41:45 crc kubenswrapper[4834]: I1008 22:41:45.384441 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7bm6j-config-pd5w6"] Oct 08 22:41:45 crc kubenswrapper[4834]: 
W1008 22:41:45.388911 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda52e86a8_cd54_4dc9_8920_9c1f53bf86c1.slice/crio-58a39800596c7a58c8b671bd527458a4a318b74e241b559137b92125b2ff002f WatchSource:0}: Error finding container 58a39800596c7a58c8b671bd527458a4a318b74e241b559137b92125b2ff002f: Status 404 returned error can't find the container with id 58a39800596c7a58c8b671bd527458a4a318b74e241b559137b92125b2ff002f Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.228447 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"3fd81fba6fef2c5e63023d49c353e82227e744ef65163a821e731e717fb6624a"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.228820 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"fd0fe5fab5d6566b676108078b9bb8956f0d4000e18eeb98d65adfb995ed66db"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.228834 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.228846 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"d032b4a1902d533ed5cdb4bbf1a9f2c37094a94a35f3ce8f02ac5f212367ceee"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.228858 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerStarted","Data":"f098a2e0cb0624cb3fe14c1c933baa20366c8520017da6de05a7436114e9e875"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.231369 4834 generic.go:334] "Generic (PLEG): container finished" podID="a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" containerID="477d72b2a6782571ffb4926c62d280ced0b20678e4a5c010ea8640ecaae5b71b" exitCode=0 Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.231420 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7bm6j-config-pd5w6" event={"ID":"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1","Type":"ContainerDied","Data":"477d72b2a6782571ffb4926c62d280ced0b20678e4a5c010ea8640ecaae5b71b"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.231453 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7bm6j-config-pd5w6" event={"ID":"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1","Type":"ContainerStarted","Data":"58a39800596c7a58c8b671bd527458a4a318b74e241b559137b92125b2ff002f"} Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.291478 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.53930684 podStartE2EDuration="31.291458089s" podCreationTimestamp="2025-10-08 22:41:15 +0000 UTC" firstStartedPulling="2025-10-08 22:41:33.32502626 +0000 UTC m=+1101.147911026" lastFinishedPulling="2025-10-08 22:41:44.077177519 +0000 UTC m=+1111.900062275" observedRunningTime="2025-10-08 22:41:46.279676684 +0000 UTC m=+1114.102561450" watchObservedRunningTime="2025-10-08 22:41:46.291458089 +0000 UTC m=+1114.114342845" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.609362 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59f45f6cf7-4gxlc"] Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.611307 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.613185 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.634368 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59f45f6cf7-4gxlc"] Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.698946 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.718346 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.724360 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.746609 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-svc\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.746695 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-config\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.746786 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlpfr\" (UniqueName: 
\"kubernetes.io/projected/532b6dee-483f-40e2-a1a6-d2d9af582e97-kube-api-access-dlpfr\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.746885 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-sb\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.746994 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-swift-storage-0\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.747132 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-nb\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848170 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4925z\" (UniqueName: \"kubernetes.io/projected/739f746e-d763-46b9-9512-1c8dde821ada-kube-api-access-4925z\") pod \"739f746e-d763-46b9-9512-1c8dde821ada\" (UID: \"739f746e-d763-46b9-9512-1c8dde821ada\") " Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848322 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q828\" 
(UniqueName: \"kubernetes.io/projected/b9226bc4-6b16-45cd-a31a-163ad9b5aa53-kube-api-access-9q828\") pod \"b9226bc4-6b16-45cd-a31a-163ad9b5aa53\" (UID: \"b9226bc4-6b16-45cd-a31a-163ad9b5aa53\") " Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848376 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn72b\" (UniqueName: \"kubernetes.io/projected/5efb48ec-b3dd-4646-b813-007d45c94ad2-kube-api-access-xn72b\") pod \"5efb48ec-b3dd-4646-b813-007d45c94ad2\" (UID: \"5efb48ec-b3dd-4646-b813-007d45c94ad2\") " Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848597 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-svc\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848625 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-config\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848661 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlpfr\" (UniqueName: \"kubernetes.io/projected/532b6dee-483f-40e2-a1a6-d2d9af582e97-kube-api-access-dlpfr\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848699 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-sb\") pod 
\"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848741 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-swift-storage-0\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.848792 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-nb\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.849620 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-nb\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.851519 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-sb\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.852098 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-swift-storage-0\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: 
\"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.852220 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-svc\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.853519 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-config\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.857056 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9226bc4-6b16-45cd-a31a-163ad9b5aa53-kube-api-access-9q828" (OuterVolumeSpecName: "kube-api-access-9q828") pod "b9226bc4-6b16-45cd-a31a-163ad9b5aa53" (UID: "b9226bc4-6b16-45cd-a31a-163ad9b5aa53"). InnerVolumeSpecName "kube-api-access-9q828". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.857228 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5efb48ec-b3dd-4646-b813-007d45c94ad2-kube-api-access-xn72b" (OuterVolumeSpecName: "kube-api-access-xn72b") pod "5efb48ec-b3dd-4646-b813-007d45c94ad2" (UID: "5efb48ec-b3dd-4646-b813-007d45c94ad2"). InnerVolumeSpecName "kube-api-access-xn72b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.858168 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739f746e-d763-46b9-9512-1c8dde821ada-kube-api-access-4925z" (OuterVolumeSpecName: "kube-api-access-4925z") pod "739f746e-d763-46b9-9512-1c8dde821ada" (UID: "739f746e-d763-46b9-9512-1c8dde821ada"). InnerVolumeSpecName "kube-api-access-4925z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.877440 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlpfr\" (UniqueName: \"kubernetes.io/projected/532b6dee-483f-40e2-a1a6-d2d9af582e97-kube-api-access-dlpfr\") pod \"dnsmasq-dns-59f45f6cf7-4gxlc\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.950264 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q828\" (UniqueName: \"kubernetes.io/projected/b9226bc4-6b16-45cd-a31a-163ad9b5aa53-kube-api-access-9q828\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.950297 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn72b\" (UniqueName: \"kubernetes.io/projected/5efb48ec-b3dd-4646-b813-007d45c94ad2-kube-api-access-xn72b\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:46 crc kubenswrapper[4834]: I1008 22:41:46.950306 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4925z\" (UniqueName: \"kubernetes.io/projected/739f746e-d763-46b9-9512-1c8dde821ada-kube-api-access-4925z\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.011687 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.026406 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.026475 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.241692 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2c76-account-create-q2px5" event={"ID":"5efb48ec-b3dd-4646-b813-007d45c94ad2","Type":"ContainerDied","Data":"5d4313e7b2b7fb76ce7b1f25b3cd498c26927a9ab168808cedbc5591d5881a80"} Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.242146 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d4313e7b2b7fb76ce7b1f25b3cd498c26927a9ab168808cedbc5591d5881a80" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.241973 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2c76-account-create-q2px5" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.244073 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3088-account-create-r9mqq" event={"ID":"b9226bc4-6b16-45cd-a31a-163ad9b5aa53","Type":"ContainerDied","Data":"88b9f098136b815ac74b459e9d9411e1bc990255241ef4dfae27bfe8ec1d2d1a"} Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.244137 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b9f098136b815ac74b459e9d9411e1bc990255241ef4dfae27bfe8ec1d2d1a" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.244200 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3088-account-create-r9mqq" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.248470 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9716-account-create-cjmxg" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.248752 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9716-account-create-cjmxg" event={"ID":"739f746e-d763-46b9-9512-1c8dde821ada","Type":"ContainerDied","Data":"2859b1a44f5eef71fc90927880e72a4636d22a5f334fff49465eab47f062e0e3"} Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.248810 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2859b1a44f5eef71fc90927880e72a4636d22a5f334fff49465eab47f062e0e3" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.540063 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59f45f6cf7-4gxlc"] Oct 08 22:41:47 crc kubenswrapper[4834]: W1008 22:41:47.553076 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod532b6dee_483f_40e2_a1a6_d2d9af582e97.slice/crio-4e9c98aada1cc187a20ba326096da1f6db2366f0849c47c1b3206e018d283c3b WatchSource:0}: Error finding container 4e9c98aada1cc187a20ba326096da1f6db2366f0849c47c1b3206e018d283c3b: Status 404 returned error can't find the container with id 4e9c98aada1cc187a20ba326096da1f6db2366f0849c47c1b3206e018d283c3b Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.589365 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.664569 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run-ovn\") pod \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.664739 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" (UID: "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665060 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run\") pod \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665127 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run" (OuterVolumeSpecName: "var-run") pod "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" (UID: "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665196 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-log-ovn\") pod \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665261 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" (UID: "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665308 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-scripts\") pod \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665415 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrmt6\" (UniqueName: \"kubernetes.io/projected/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-kube-api-access-zrmt6\") pod \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665489 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-additional-scripts\") pod \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\" (UID: \"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1\") " Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665929 4834 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665951 4834 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.665963 4834 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.666273 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" (UID: "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.666527 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-scripts" (OuterVolumeSpecName: "scripts") pod "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" (UID: "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.671596 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-kube-api-access-zrmt6" (OuterVolumeSpecName: "kube-api-access-zrmt6") pod "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" (UID: "a52e86a8-cd54-4dc9-8920-9c1f53bf86c1"). InnerVolumeSpecName "kube-api-access-zrmt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.767683 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrmt6\" (UniqueName: \"kubernetes.io/projected/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-kube-api-access-zrmt6\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.767723 4834 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:47 crc kubenswrapper[4834]: I1008 22:41:47.767741 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.260767 4834 generic.go:334] "Generic (PLEG): container finished" podID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerID="61a624379d927b98ebda00e533f085b2f8f5d292801d76fb051fcf85248dad09" exitCode=0 Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.260864 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" event={"ID":"532b6dee-483f-40e2-a1a6-d2d9af582e97","Type":"ContainerDied","Data":"61a624379d927b98ebda00e533f085b2f8f5d292801d76fb051fcf85248dad09"} Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.260897 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" event={"ID":"532b6dee-483f-40e2-a1a6-d2d9af582e97","Type":"ContainerStarted","Data":"4e9c98aada1cc187a20ba326096da1f6db2366f0849c47c1b3206e018d283c3b"} Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.263804 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7bm6j-config-pd5w6" 
event={"ID":"a52e86a8-cd54-4dc9-8920-9c1f53bf86c1","Type":"ContainerDied","Data":"58a39800596c7a58c8b671bd527458a4a318b74e241b559137b92125b2ff002f"} Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.263828 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58a39800596c7a58c8b671bd527458a4a318b74e241b559137b92125b2ff002f" Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.263854 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7bm6j-config-pd5w6" Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.684447 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7bm6j-config-pd5w6"] Oct 08 22:41:48 crc kubenswrapper[4834]: I1008 22:41:48.697547 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7bm6j-config-pd5w6"] Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.241213 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7bm6j" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.273337 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" event={"ID":"532b6dee-483f-40e2-a1a6-d2d9af582e97","Type":"ContainerStarted","Data":"bda88da6127166843e209ec22e5eb1b0a712814c0eb0fbc484f3b3b9baa29e6d"} Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.273482 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.306837 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podStartSLOduration=3.306815312 podStartE2EDuration="3.306815312s" podCreationTimestamp="2025-10-08 22:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 22:41:49.297901598 +0000 UTC m=+1117.120786364" watchObservedRunningTime="2025-10-08 22:41:49.306815312 +0000 UTC m=+1117.129700058" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.309664 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dcw47"] Oct 08 22:41:49 crc kubenswrapper[4834]: E1008 22:41:49.309968 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739f746e-d763-46b9-9512-1c8dde821ada" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.309989 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="739f746e-d763-46b9-9512-1c8dde821ada" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: E1008 22:41:49.310013 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5efb48ec-b3dd-4646-b813-007d45c94ad2" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310020 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="5efb48ec-b3dd-4646-b813-007d45c94ad2" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: E1008 22:41:49.310032 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" containerName="ovn-config" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310037 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" containerName="ovn-config" Oct 08 22:41:49 crc kubenswrapper[4834]: E1008 22:41:49.310049 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9226bc4-6b16-45cd-a31a-163ad9b5aa53" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310055 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9226bc4-6b16-45cd-a31a-163ad9b5aa53" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 
22:41:49.310213 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="739f746e-d763-46b9-9512-1c8dde821ada" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310236 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9226bc4-6b16-45cd-a31a-163ad9b5aa53" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310248 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="5efb48ec-b3dd-4646-b813-007d45c94ad2" containerName="mariadb-account-create" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310255 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" containerName="ovn-config" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.310744 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.313540 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.314333 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fq748" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.322709 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dcw47"] Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.396481 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-db-sync-config-data\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.396664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-config-data\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.396843 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-combined-ca-bundle\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.396979 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqtc6\" (UniqueName: \"kubernetes.io/projected/e0428da9-4f94-4297-b639-c8b777b1d216-kube-api-access-rqtc6\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.498049 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-db-sync-config-data\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.498123 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-config-data\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.498185 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-combined-ca-bundle\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.498220 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqtc6\" (UniqueName: \"kubernetes.io/projected/e0428da9-4f94-4297-b639-c8b777b1d216-kube-api-access-rqtc6\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.504888 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-config-data\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.504956 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-combined-ca-bundle\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.505559 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-db-sync-config-data\") pod \"glance-db-sync-dcw47\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.514174 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqtc6\" (UniqueName: \"kubernetes.io/projected/e0428da9-4f94-4297-b639-c8b777b1d216-kube-api-access-rqtc6\") pod \"glance-db-sync-dcw47\" (UID: 
\"e0428da9-4f94-4297-b639-c8b777b1d216\") " pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.565814 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52e86a8-cd54-4dc9-8920-9c1f53bf86c1" path="/var/lib/kubelet/pods/a52e86a8-cd54-4dc9-8920-9c1f53bf86c1/volumes" Oct 08 22:41:49 crc kubenswrapper[4834]: I1008 22:41:49.629868 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dcw47" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.128706 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dcw47"] Oct 08 22:41:50 crc kubenswrapper[4834]: W1008 22:41:50.140318 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0428da9_4f94_4297_b639_c8b777b1d216.slice/crio-1abe14f0728ee7d29d9a50e8ae18b5922de6b3120cee1f14d81cefd509d55079 WatchSource:0}: Error finding container 1abe14f0728ee7d29d9a50e8ae18b5922de6b3120cee1f14d81cefd509d55079: Status 404 returned error can't find the container with id 1abe14f0728ee7d29d9a50e8ae18b5922de6b3120cee1f14d81cefd509d55079 Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.271770 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.287896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dcw47" event={"ID":"e0428da9-4f94-4297-b639-c8b777b1d216","Type":"ContainerStarted","Data":"1abe14f0728ee7d29d9a50e8ae18b5922de6b3120cee1f14d81cefd509d55079"} Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.553633 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-pqv9l"] Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.555585 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.562679 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pqv9l"] Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.615071 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jl7\" (UniqueName: \"kubernetes.io/projected/4d83d01f-362f-463f-b837-8d39418f3abf-kube-api-access-p4jl7\") pod \"cinder-db-create-pqv9l\" (UID: \"4d83d01f-362f-463f-b837-8d39418f3abf\") " pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.649312 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.651758 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f9gml"] Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.653191 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.668974 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f9gml"] Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.716590 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49fc\" (UniqueName: \"kubernetes.io/projected/287d1d4a-d93e-4866-89c5-72b876734d9e-kube-api-access-h49fc\") pod \"barbican-db-create-f9gml\" (UID: \"287d1d4a-d93e-4866-89c5-72b876734d9e\") " pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.716830 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jl7\" (UniqueName: \"kubernetes.io/projected/4d83d01f-362f-463f-b837-8d39418f3abf-kube-api-access-p4jl7\") pod \"cinder-db-create-pqv9l\" (UID: \"4d83d01f-362f-463f-b837-8d39418f3abf\") " pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.763412 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jl7\" (UniqueName: \"kubernetes.io/projected/4d83d01f-362f-463f-b837-8d39418f3abf-kube-api-access-p4jl7\") pod \"cinder-db-create-pqv9l\" (UID: \"4d83d01f-362f-463f-b837-8d39418f3abf\") " pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.818367 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h49fc\" (UniqueName: \"kubernetes.io/projected/287d1d4a-d93e-4866-89c5-72b876734d9e-kube-api-access-h49fc\") pod \"barbican-db-create-f9gml\" (UID: \"287d1d4a-d93e-4866-89c5-72b876734d9e\") " pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.837911 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49fc\" (UniqueName: 
\"kubernetes.io/projected/287d1d4a-d93e-4866-89c5-72b876734d9e-kube-api-access-h49fc\") pod \"barbican-db-create-f9gml\" (UID: \"287d1d4a-d93e-4866-89c5-72b876734d9e\") " pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.855537 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zfddn"] Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.856636 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.883805 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zfddn"] Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.884022 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.920196 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvz85\" (UniqueName: \"kubernetes.io/projected/c8ecacbd-3766-4d66-a888-a0bed940192d-kube-api-access-kvz85\") pod \"neutron-db-create-zfddn\" (UID: \"c8ecacbd-3766-4d66-a888-a0bed940192d\") " pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:50 crc kubenswrapper[4834]: I1008 22:41:50.971853 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.021601 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvz85\" (UniqueName: \"kubernetes.io/projected/c8ecacbd-3766-4d66-a888-a0bed940192d-kube-api-access-kvz85\") pod \"neutron-db-create-zfddn\" (UID: \"c8ecacbd-3766-4d66-a888-a0bed940192d\") " pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.057329 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7bmqf"] Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.058436 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.062241 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.062539 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c58dz" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.062803 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.104348 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.104458 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvz85\" (UniqueName: \"kubernetes.io/projected/c8ecacbd-3766-4d66-a888-a0bed940192d-kube-api-access-kvz85\") pod \"neutron-db-create-zfddn\" (UID: \"c8ecacbd-3766-4d66-a888-a0bed940192d\") " pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.126057 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-config-data\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.126351 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmg9\" (UniqueName: \"kubernetes.io/projected/c7f6b5b1-dc89-433a-987a-5c122cfcd241-kube-api-access-gxmg9\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.126394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-combined-ca-bundle\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.163284 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7bmqf"] Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.197734 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.229324 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmg9\" (UniqueName: \"kubernetes.io/projected/c7f6b5b1-dc89-433a-987a-5c122cfcd241-kube-api-access-gxmg9\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.229368 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-combined-ca-bundle\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.229427 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-config-data\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.235184 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-combined-ca-bundle\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.236231 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-config-data\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 
22:41:51.245565 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmg9\" (UniqueName: \"kubernetes.io/projected/c7f6b5b1-dc89-433a-987a-5c122cfcd241-kube-api-access-gxmg9\") pod \"keystone-db-sync-7bmqf\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.351052 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f9gml"] Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.433388 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.449033 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pqv9l"] Oct 08 22:41:51 crc kubenswrapper[4834]: W1008 22:41:51.476902 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d83d01f_362f_463f_b837_8d39418f3abf.slice/crio-6d44df78e966ab23eed0d65b8aa07d7915b5c6326e067bfe9236a56228a52b27 WatchSource:0}: Error finding container 6d44df78e966ab23eed0d65b8aa07d7915b5c6326e067bfe9236a56228a52b27: Status 404 returned error can't find the container with id 6d44df78e966ab23eed0d65b8aa07d7915b5c6326e067bfe9236a56228a52b27 Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.673884 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zfddn"] Oct 08 22:41:51 crc kubenswrapper[4834]: I1008 22:41:51.913852 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7bmqf"] Oct 08 22:41:51 crc kubenswrapper[4834]: W1008 22:41:51.921740 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7f6b5b1_dc89_433a_987a_5c122cfcd241.slice/crio-44c72c1e544e9ab30418d1c933bb278430a73a42fd258dc6ff538f0eecdbfd66 WatchSource:0}: Error finding container 44c72c1e544e9ab30418d1c933bb278430a73a42fd258dc6ff538f0eecdbfd66: Status 404 returned error can't find the container with id 44c72c1e544e9ab30418d1c933bb278430a73a42fd258dc6ff538f0eecdbfd66 Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.324227 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7bmqf" event={"ID":"c7f6b5b1-dc89-433a-987a-5c122cfcd241","Type":"ContainerStarted","Data":"44c72c1e544e9ab30418d1c933bb278430a73a42fd258dc6ff538f0eecdbfd66"} Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.327612 4834 generic.go:334] "Generic (PLEG): container finished" podID="c8ecacbd-3766-4d66-a888-a0bed940192d" containerID="37af688ed90d76271eb63e211669328e52636154c9cb1ea6ab93a0c2ac6a207b" exitCode=0 Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.328633 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfddn" event={"ID":"c8ecacbd-3766-4d66-a888-a0bed940192d","Type":"ContainerDied","Data":"37af688ed90d76271eb63e211669328e52636154c9cb1ea6ab93a0c2ac6a207b"} Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.328688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfddn" event={"ID":"c8ecacbd-3766-4d66-a888-a0bed940192d","Type":"ContainerStarted","Data":"2e4c497e7818f5d4f04983fbc29a904f2829ca7208a32f5f008ee19044c2fd79"} Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.332239 4834 generic.go:334] "Generic (PLEG): container finished" podID="287d1d4a-d93e-4866-89c5-72b876734d9e" containerID="e4462766e55e41ac2f173aa20c7c41f1e35a51ebc38de144712b6577aab16b7d" exitCode=0 Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.332382 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-f9gml" event={"ID":"287d1d4a-d93e-4866-89c5-72b876734d9e","Type":"ContainerDied","Data":"e4462766e55e41ac2f173aa20c7c41f1e35a51ebc38de144712b6577aab16b7d"} Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.332413 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f9gml" event={"ID":"287d1d4a-d93e-4866-89c5-72b876734d9e","Type":"ContainerStarted","Data":"724e28097b5e364c3559167fe601350c10066bc1bc7a964c8f017cde8a872b58"} Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.334905 4834 generic.go:334] "Generic (PLEG): container finished" podID="4d83d01f-362f-463f-b837-8d39418f3abf" containerID="95ac071ead20f06847c957bb59d09b617365f2bd85c8f5567ed056724b25d3f8" exitCode=0 Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.334947 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pqv9l" event={"ID":"4d83d01f-362f-463f-b837-8d39418f3abf","Type":"ContainerDied","Data":"95ac071ead20f06847c957bb59d09b617365f2bd85c8f5567ed056724b25d3f8"} Oct 08 22:41:52 crc kubenswrapper[4834]: I1008 22:41:52.334972 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pqv9l" event={"ID":"4d83d01f-362f-463f-b837-8d39418f3abf","Type":"ContainerStarted","Data":"6d44df78e966ab23eed0d65b8aa07d7915b5c6326e067bfe9236a56228a52b27"} Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.709902 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.846161 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.851169 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.881948 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvz85\" (UniqueName: \"kubernetes.io/projected/c8ecacbd-3766-4d66-a888-a0bed940192d-kube-api-access-kvz85\") pod \"c8ecacbd-3766-4d66-a888-a0bed940192d\" (UID: \"c8ecacbd-3766-4d66-a888-a0bed940192d\") " Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.889423 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ecacbd-3766-4d66-a888-a0bed940192d-kube-api-access-kvz85" (OuterVolumeSpecName: "kube-api-access-kvz85") pod "c8ecacbd-3766-4d66-a888-a0bed940192d" (UID: "c8ecacbd-3766-4d66-a888-a0bed940192d"). InnerVolumeSpecName "kube-api-access-kvz85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.983100 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h49fc\" (UniqueName: \"kubernetes.io/projected/287d1d4a-d93e-4866-89c5-72b876734d9e-kube-api-access-h49fc\") pod \"287d1d4a-d93e-4866-89c5-72b876734d9e\" (UID: \"287d1d4a-d93e-4866-89c5-72b876734d9e\") " Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.983427 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4jl7\" (UniqueName: \"kubernetes.io/projected/4d83d01f-362f-463f-b837-8d39418f3abf-kube-api-access-p4jl7\") pod \"4d83d01f-362f-463f-b837-8d39418f3abf\" (UID: \"4d83d01f-362f-463f-b837-8d39418f3abf\") " Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.984053 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvz85\" (UniqueName: \"kubernetes.io/projected/c8ecacbd-3766-4d66-a888-a0bed940192d-kube-api-access-kvz85\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.987329 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287d1d4a-d93e-4866-89c5-72b876734d9e-kube-api-access-h49fc" (OuterVolumeSpecName: "kube-api-access-h49fc") pod "287d1d4a-d93e-4866-89c5-72b876734d9e" (UID: "287d1d4a-d93e-4866-89c5-72b876734d9e"). InnerVolumeSpecName "kube-api-access-h49fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:53 crc kubenswrapper[4834]: I1008 22:41:53.987812 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d83d01f-362f-463f-b837-8d39418f3abf-kube-api-access-p4jl7" (OuterVolumeSpecName: "kube-api-access-p4jl7") pod "4d83d01f-362f-463f-b837-8d39418f3abf" (UID: "4d83d01f-362f-463f-b837-8d39418f3abf"). InnerVolumeSpecName "kube-api-access-p4jl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.088492 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4jl7\" (UniqueName: \"kubernetes.io/projected/4d83d01f-362f-463f-b837-8d39418f3abf-kube-api-access-p4jl7\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.088525 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h49fc\" (UniqueName: \"kubernetes.io/projected/287d1d4a-d93e-4866-89c5-72b876734d9e-kube-api-access-h49fc\") on node \"crc\" DevicePath \"\"" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.352164 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zfddn" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.352163 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfddn" event={"ID":"c8ecacbd-3766-4d66-a888-a0bed940192d","Type":"ContainerDied","Data":"2e4c497e7818f5d4f04983fbc29a904f2829ca7208a32f5f008ee19044c2fd79"} Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.352362 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4c497e7818f5d4f04983fbc29a904f2829ca7208a32f5f008ee19044c2fd79" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.354179 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f9gml" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.354259 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f9gml" event={"ID":"287d1d4a-d93e-4866-89c5-72b876734d9e","Type":"ContainerDied","Data":"724e28097b5e364c3559167fe601350c10066bc1bc7a964c8f017cde8a872b58"} Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.354297 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="724e28097b5e364c3559167fe601350c10066bc1bc7a964c8f017cde8a872b58" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.357933 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pqv9l" event={"ID":"4d83d01f-362f-463f-b837-8d39418f3abf","Type":"ContainerDied","Data":"6d44df78e966ab23eed0d65b8aa07d7915b5c6326e067bfe9236a56228a52b27"} Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.357956 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d44df78e966ab23eed0d65b8aa07d7915b5c6326e067bfe9236a56228a52b27" Oct 08 22:41:54 crc kubenswrapper[4834]: I1008 22:41:54.357999 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pqv9l" Oct 08 22:41:57 crc kubenswrapper[4834]: I1008 22:41:57.014475 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:41:57 crc kubenswrapper[4834]: I1008 22:41:57.090486 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-5bxfv"] Oct 08 22:41:57 crc kubenswrapper[4834]: I1008 22:41:57.090817 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="dnsmasq-dns" containerID="cri-o://f0b6250ea0aa62689a7cecc3a1672706ca1cd0027b1e3877b2292f8da2d8ab95" gracePeriod=10 Oct 08 22:41:58 crc kubenswrapper[4834]: I1008 22:41:58.401197 4834 generic.go:334] "Generic (PLEG): container finished" podID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerID="f0b6250ea0aa62689a7cecc3a1672706ca1cd0027b1e3877b2292f8da2d8ab95" exitCode=0 Oct 08 22:41:58 crc kubenswrapper[4834]: I1008 22:41:58.401253 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" event={"ID":"361e1c9f-8765-484b-a2ef-4b2e3db1af99","Type":"ContainerDied","Data":"f0b6250ea0aa62689a7cecc3a1672706ca1cd0027b1e3877b2292f8da2d8ab95"} Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.726932 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6272-account-create-dszws"] Oct 08 22:42:00 crc kubenswrapper[4834]: E1008 22:42:00.727306 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ecacbd-3766-4d66-a888-a0bed940192d" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.727319 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ecacbd-3766-4d66-a888-a0bed940192d" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: E1008 22:42:00.727332 4834 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d83d01f-362f-463f-b837-8d39418f3abf" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.727337 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d83d01f-362f-463f-b837-8d39418f3abf" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: E1008 22:42:00.727362 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287d1d4a-d93e-4866-89c5-72b876734d9e" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.727368 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="287d1d4a-d93e-4866-89c5-72b876734d9e" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.727526 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d83d01f-362f-463f-b837-8d39418f3abf" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.727541 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="287d1d4a-d93e-4866-89c5-72b876734d9e" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.727557 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ecacbd-3766-4d66-a888-a0bed940192d" containerName="mariadb-database-create" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.728091 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.730912 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.738498 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6272-account-create-dszws"] Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.809183 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-57d9-account-create-hm5jw"] Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.810644 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.816211 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.826509 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-57d9-account-create-hm5jw"] Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.843100 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m7xh\" (UniqueName: \"kubernetes.io/projected/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2-kube-api-access-2m7xh\") pod \"cinder-57d9-account-create-hm5jw\" (UID: \"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2\") " pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.843186 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qsm\" (UniqueName: \"kubernetes.io/projected/26081853-747e-4f7b-af9b-819dc967f807-kube-api-access-82qsm\") pod \"barbican-6272-account-create-dszws\" (UID: \"26081853-747e-4f7b-af9b-819dc967f807\") " pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:00 crc kubenswrapper[4834]: 
I1008 22:42:00.944274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m7xh\" (UniqueName: \"kubernetes.io/projected/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2-kube-api-access-2m7xh\") pod \"cinder-57d9-account-create-hm5jw\" (UID: \"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2\") " pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.944375 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82qsm\" (UniqueName: \"kubernetes.io/projected/26081853-747e-4f7b-af9b-819dc967f807-kube-api-access-82qsm\") pod \"barbican-6272-account-create-dszws\" (UID: \"26081853-747e-4f7b-af9b-819dc967f807\") " pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.962283 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m7xh\" (UniqueName: \"kubernetes.io/projected/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2-kube-api-access-2m7xh\") pod \"cinder-57d9-account-create-hm5jw\" (UID: \"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2\") " pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:00 crc kubenswrapper[4834]: I1008 22:42:00.967257 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qsm\" (UniqueName: \"kubernetes.io/projected/26081853-747e-4f7b-af9b-819dc967f807-kube-api-access-82qsm\") pod \"barbican-6272-account-create-dszws\" (UID: \"26081853-747e-4f7b-af9b-819dc967f807\") " pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.056722 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.109181 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3518-account-create-g28bg"] Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.110985 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.113471 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.128437 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3518-account-create-g28bg"] Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.134880 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.150031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992c8\" (UniqueName: \"kubernetes.io/projected/d4a0f324-ef5b-4320-b735-70a1f26376a0-kube-api-access-992c8\") pod \"neutron-3518-account-create-g28bg\" (UID: \"d4a0f324-ef5b-4320-b735-70a1f26376a0\") " pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.251457 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-992c8\" (UniqueName: \"kubernetes.io/projected/d4a0f324-ef5b-4320-b735-70a1f26376a0-kube-api-access-992c8\") pod \"neutron-3518-account-create-g28bg\" (UID: \"d4a0f324-ef5b-4320-b735-70a1f26376a0\") " pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.273086 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-992c8\" (UniqueName: 
\"kubernetes.io/projected/d4a0f324-ef5b-4320-b735-70a1f26376a0-kube-api-access-992c8\") pod \"neutron-3518-account-create-g28bg\" (UID: \"d4a0f324-ef5b-4320-b735-70a1f26376a0\") " pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:01 crc kubenswrapper[4834]: I1008 22:42:01.456210 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.427456 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.481443 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" event={"ID":"361e1c9f-8765-484b-a2ef-4b2e3db1af99","Type":"ContainerDied","Data":"aae7a6c13a1b60aea04e210d940817f4ad395c8f5fdb11d5304714178acf8442"} Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.481495 4834 scope.go:117] "RemoveContainer" containerID="f0b6250ea0aa62689a7cecc3a1672706ca1cd0027b1e3877b2292f8da2d8ab95" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.481543 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.497404 4834 scope.go:117] "RemoveContainer" containerID="21726a9a88b68197f50bd3071c644dee6314754cdcf9f7b3c93e80adfa418676" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.527479 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txgkl\" (UniqueName: \"kubernetes.io/projected/361e1c9f-8765-484b-a2ef-4b2e3db1af99-kube-api-access-txgkl\") pod \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.527605 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-sb\") pod \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.527785 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-nb\") pod \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.527843 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-config\") pod \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\" (UID: \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.528003 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-dns-svc\") pod \"361e1c9f-8765-484b-a2ef-4b2e3db1af99\" (UID: 
\"361e1c9f-8765-484b-a2ef-4b2e3db1af99\") " Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.537331 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361e1c9f-8765-484b-a2ef-4b2e3db1af99-kube-api-access-txgkl" (OuterVolumeSpecName: "kube-api-access-txgkl") pod "361e1c9f-8765-484b-a2ef-4b2e3db1af99" (UID: "361e1c9f-8765-484b-a2ef-4b2e3db1af99"). InnerVolumeSpecName "kube-api-access-txgkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.580864 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "361e1c9f-8765-484b-a2ef-4b2e3db1af99" (UID: "361e1c9f-8765-484b-a2ef-4b2e3db1af99"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.582278 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "361e1c9f-8765-484b-a2ef-4b2e3db1af99" (UID: "361e1c9f-8765-484b-a2ef-4b2e3db1af99"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.582909 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "361e1c9f-8765-484b-a2ef-4b2e3db1af99" (UID: "361e1c9f-8765-484b-a2ef-4b2e3db1af99"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.589989 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-config" (OuterVolumeSpecName: "config") pod "361e1c9f-8765-484b-a2ef-4b2e3db1af99" (UID: "361e1c9f-8765-484b-a2ef-4b2e3db1af99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.630077 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.630114 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.630127 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.630136 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/361e1c9f-8765-484b-a2ef-4b2e3db1af99-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.630159 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txgkl\" (UniqueName: \"kubernetes.io/projected/361e1c9f-8765-484b-a2ef-4b2e3db1af99-kube-api-access-txgkl\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.767573 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-57d9-account-create-hm5jw"] Oct 08 22:42:05 crc 
kubenswrapper[4834]: W1008 22:42:05.779281 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeafc6c7f_9e9b_4232_b0dc_82225a78e1d2.slice/crio-2e33540fba54265ee57655c9f09e4c979bb73b78987396089abbbe7965a2d8fa WatchSource:0}: Error finding container 2e33540fba54265ee57655c9f09e4c979bb73b78987396089abbbe7965a2d8fa: Status 404 returned error can't find the container with id 2e33540fba54265ee57655c9f09e4c979bb73b78987396089abbbe7965a2d8fa Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.797052 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57f58c7cff-5bxfv" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.825472 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-5bxfv"] Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.831531 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-5bxfv"] Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.839495 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3518-account-create-g28bg"] Oct 08 22:42:05 crc kubenswrapper[4834]: W1008 22:42:05.843575 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4a0f324_ef5b_4320_b735_70a1f26376a0.slice/crio-1bff8c4c9206e7becfe216bf08aab53da3010f6b19306541e41a243a01d7d9b9 WatchSource:0}: Error finding container 1bff8c4c9206e7becfe216bf08aab53da3010f6b19306541e41a243a01d7d9b9: Status 404 returned error can't find the container with id 1bff8c4c9206e7becfe216bf08aab53da3010f6b19306541e41a243a01d7d9b9 Oct 08 22:42:05 crc kubenswrapper[4834]: W1008 22:42:05.844725 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26081853_747e_4f7b_af9b_819dc967f807.slice/crio-856c08d0e602c759e4a6c367e8b0cbeeb1c2d586f12fa90565d60ff821950858 WatchSource:0}: Error finding container 856c08d0e602c759e4a6c367e8b0cbeeb1c2d586f12fa90565d60ff821950858: Status 404 returned error can't find the container with id 856c08d0e602c759e4a6c367e8b0cbeeb1c2d586f12fa90565d60ff821950858 Oct 08 22:42:05 crc kubenswrapper[4834]: I1008 22:42:05.845258 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6272-account-create-dszws"] Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.492102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dcw47" event={"ID":"e0428da9-4f94-4297-b639-c8b777b1d216","Type":"ContainerStarted","Data":"9aa7536da54254fbbacb96af2d84bae4ba72329943d5c430446435f27ee30123"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.496643 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7bmqf" event={"ID":"c7f6b5b1-dc89-433a-987a-5c122cfcd241","Type":"ContainerStarted","Data":"26138348638c67ebd21b0a53f61f166b58c8148f9fe6ddcb4703f9285102299a"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.499262 4834 generic.go:334] "Generic (PLEG): container finished" podID="d4a0f324-ef5b-4320-b735-70a1f26376a0" containerID="4fabbc353f24c6c2298b39e11ea1fd36fa1a517790dad8dfc9cbf3232a5ba1e4" exitCode=0 Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.499312 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3518-account-create-g28bg" event={"ID":"d4a0f324-ef5b-4320-b735-70a1f26376a0","Type":"ContainerDied","Data":"4fabbc353f24c6c2298b39e11ea1fd36fa1a517790dad8dfc9cbf3232a5ba1e4"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.499327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3518-account-create-g28bg" 
event={"ID":"d4a0f324-ef5b-4320-b735-70a1f26376a0","Type":"ContainerStarted","Data":"1bff8c4c9206e7becfe216bf08aab53da3010f6b19306541e41a243a01d7d9b9"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.500618 4834 generic.go:334] "Generic (PLEG): container finished" podID="26081853-747e-4f7b-af9b-819dc967f807" containerID="aafc6f28f6192bb2e21a743fca46de49ec0f69b188ca6747d25a84008f13d7e4" exitCode=0 Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.500696 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6272-account-create-dszws" event={"ID":"26081853-747e-4f7b-af9b-819dc967f807","Type":"ContainerDied","Data":"aafc6f28f6192bb2e21a743fca46de49ec0f69b188ca6747d25a84008f13d7e4"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.500709 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6272-account-create-dszws" event={"ID":"26081853-747e-4f7b-af9b-819dc967f807","Type":"ContainerStarted","Data":"856c08d0e602c759e4a6c367e8b0cbeeb1c2d586f12fa90565d60ff821950858"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.502304 4834 generic.go:334] "Generic (PLEG): container finished" podID="eafc6c7f-9e9b-4232-b0dc-82225a78e1d2" containerID="af00c748dce9310bfc4c5c0af24bc0e0c9f62dba443f13c4b67ecf0ccef26abf" exitCode=0 Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.502330 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-57d9-account-create-hm5jw" event={"ID":"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2","Type":"ContainerDied","Data":"af00c748dce9310bfc4c5c0af24bc0e0c9f62dba443f13c4b67ecf0ccef26abf"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.502343 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-57d9-account-create-hm5jw" event={"ID":"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2","Type":"ContainerStarted","Data":"2e33540fba54265ee57655c9f09e4c979bb73b78987396089abbbe7965a2d8fa"} Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 
22:42:06.528767 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dcw47" podStartSLOduration=2.424894039 podStartE2EDuration="17.528750642s" podCreationTimestamp="2025-10-08 22:41:49 +0000 UTC" firstStartedPulling="2025-10-08 22:41:50.143696641 +0000 UTC m=+1117.966581397" lastFinishedPulling="2025-10-08 22:42:05.247553244 +0000 UTC m=+1133.070438000" observedRunningTime="2025-10-08 22:42:06.522267615 +0000 UTC m=+1134.345152371" watchObservedRunningTime="2025-10-08 22:42:06.528750642 +0000 UTC m=+1134.351635378" Oct 08 22:42:06 crc kubenswrapper[4834]: I1008 22:42:06.571078 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7bmqf" podStartSLOduration=2.254587963 podStartE2EDuration="15.571059224s" podCreationTimestamp="2025-10-08 22:41:51 +0000 UTC" firstStartedPulling="2025-10-08 22:41:51.924557685 +0000 UTC m=+1119.747442431" lastFinishedPulling="2025-10-08 22:42:05.241028946 +0000 UTC m=+1133.063913692" observedRunningTime="2025-10-08 22:42:06.569828524 +0000 UTC m=+1134.392713270" watchObservedRunningTime="2025-10-08 22:42:06.571059224 +0000 UTC m=+1134.393943970" Oct 08 22:42:07 crc kubenswrapper[4834]: I1008 22:42:07.576489 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" path="/var/lib/kubelet/pods/361e1c9f-8765-484b-a2ef-4b2e3db1af99/volumes" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:07.970629 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:07.976414 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:07.980929 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.074794 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m7xh\" (UniqueName: \"kubernetes.io/projected/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2-kube-api-access-2m7xh\") pod \"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2\" (UID: \"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2\") " Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.075279 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-992c8\" (UniqueName: \"kubernetes.io/projected/d4a0f324-ef5b-4320-b735-70a1f26376a0-kube-api-access-992c8\") pod \"d4a0f324-ef5b-4320-b735-70a1f26376a0\" (UID: \"d4a0f324-ef5b-4320-b735-70a1f26376a0\") " Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.075395 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82qsm\" (UniqueName: \"kubernetes.io/projected/26081853-747e-4f7b-af9b-819dc967f807-kube-api-access-82qsm\") pod \"26081853-747e-4f7b-af9b-819dc967f807\" (UID: \"26081853-747e-4f7b-af9b-819dc967f807\") " Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.081808 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2-kube-api-access-2m7xh" (OuterVolumeSpecName: "kube-api-access-2m7xh") pod "eafc6c7f-9e9b-4232-b0dc-82225a78e1d2" (UID: "eafc6c7f-9e9b-4232-b0dc-82225a78e1d2"). InnerVolumeSpecName "kube-api-access-2m7xh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.081896 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26081853-747e-4f7b-af9b-819dc967f807-kube-api-access-82qsm" (OuterVolumeSpecName: "kube-api-access-82qsm") pod "26081853-747e-4f7b-af9b-819dc967f807" (UID: "26081853-747e-4f7b-af9b-819dc967f807"). InnerVolumeSpecName "kube-api-access-82qsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.095660 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a0f324-ef5b-4320-b735-70a1f26376a0-kube-api-access-992c8" (OuterVolumeSpecName: "kube-api-access-992c8") pod "d4a0f324-ef5b-4320-b735-70a1f26376a0" (UID: "d4a0f324-ef5b-4320-b735-70a1f26376a0"). InnerVolumeSpecName "kube-api-access-992c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.177503 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-992c8\" (UniqueName: \"kubernetes.io/projected/d4a0f324-ef5b-4320-b735-70a1f26376a0-kube-api-access-992c8\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.177539 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82qsm\" (UniqueName: \"kubernetes.io/projected/26081853-747e-4f7b-af9b-819dc967f807-kube-api-access-82qsm\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.177551 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m7xh\" (UniqueName: \"kubernetes.io/projected/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2-kube-api-access-2m7xh\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.537706 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3518-account-create-g28bg" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.537733 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3518-account-create-g28bg" event={"ID":"d4a0f324-ef5b-4320-b735-70a1f26376a0","Type":"ContainerDied","Data":"1bff8c4c9206e7becfe216bf08aab53da3010f6b19306541e41a243a01d7d9b9"} Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.537893 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bff8c4c9206e7becfe216bf08aab53da3010f6b19306541e41a243a01d7d9b9" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.540797 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6272-account-create-dszws" event={"ID":"26081853-747e-4f7b-af9b-819dc967f807","Type":"ContainerDied","Data":"856c08d0e602c759e4a6c367e8b0cbeeb1c2d586f12fa90565d60ff821950858"} Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.540847 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856c08d0e602c759e4a6c367e8b0cbeeb1c2d586f12fa90565d60ff821950858" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.540865 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6272-account-create-dszws" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.543587 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-57d9-account-create-hm5jw" event={"ID":"eafc6c7f-9e9b-4232-b0dc-82225a78e1d2","Type":"ContainerDied","Data":"2e33540fba54265ee57655c9f09e4c979bb73b78987396089abbbe7965a2d8fa"} Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.543633 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e33540fba54265ee57655c9f09e4c979bb73b78987396089abbbe7965a2d8fa" Oct 08 22:42:08 crc kubenswrapper[4834]: I1008 22:42:08.543741 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-57d9-account-create-hm5jw" Oct 08 22:42:09 crc kubenswrapper[4834]: I1008 22:42:09.561308 4834 generic.go:334] "Generic (PLEG): container finished" podID="c7f6b5b1-dc89-433a-987a-5c122cfcd241" containerID="26138348638c67ebd21b0a53f61f166b58c8148f9fe6ddcb4703f9285102299a" exitCode=0 Oct 08 22:42:09 crc kubenswrapper[4834]: I1008 22:42:09.574296 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7bmqf" event={"ID":"c7f6b5b1-dc89-433a-987a-5c122cfcd241","Type":"ContainerDied","Data":"26138348638c67ebd21b0a53f61f166b58c8148f9fe6ddcb4703f9285102299a"} Oct 08 22:42:10 crc kubenswrapper[4834]: I1008 22:42:10.907562 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.040467 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-combined-ca-bundle\") pod \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.040826 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-config-data\") pod \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.040889 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmg9\" (UniqueName: \"kubernetes.io/projected/c7f6b5b1-dc89-433a-987a-5c122cfcd241-kube-api-access-gxmg9\") pod \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\" (UID: \"c7f6b5b1-dc89-433a-987a-5c122cfcd241\") " Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.046033 4834 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f6b5b1-dc89-433a-987a-5c122cfcd241-kube-api-access-gxmg9" (OuterVolumeSpecName: "kube-api-access-gxmg9") pod "c7f6b5b1-dc89-433a-987a-5c122cfcd241" (UID: "c7f6b5b1-dc89-433a-987a-5c122cfcd241"). InnerVolumeSpecName "kube-api-access-gxmg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.068579 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7f6b5b1-dc89-433a-987a-5c122cfcd241" (UID: "c7f6b5b1-dc89-433a-987a-5c122cfcd241"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.089709 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-config-data" (OuterVolumeSpecName: "config-data") pod "c7f6b5b1-dc89-433a-987a-5c122cfcd241" (UID: "c7f6b5b1-dc89-433a-987a-5c122cfcd241"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.144082 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmg9\" (UniqueName: \"kubernetes.io/projected/c7f6b5b1-dc89-433a-987a-5c122cfcd241-kube-api-access-gxmg9\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.144248 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.144265 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7f6b5b1-dc89-433a-987a-5c122cfcd241-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.584299 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7bmqf" event={"ID":"c7f6b5b1-dc89-433a-987a-5c122cfcd241","Type":"ContainerDied","Data":"44c72c1e544e9ab30418d1c933bb278430a73a42fd258dc6ff538f0eecdbfd66"} Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.584434 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c72c1e544e9ab30418d1c933bb278430a73a42fd258dc6ff538f0eecdbfd66" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.584371 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-7bmqf" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.865709 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b46c4c7-bwrtb"] Oct 08 22:42:11 crc kubenswrapper[4834]: E1008 22:42:11.866521 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a0f324-ef5b-4320-b735-70a1f26376a0" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.866580 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a0f324-ef5b-4320-b735-70a1f26376a0" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: E1008 22:42:11.866636 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="init" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.866690 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="init" Oct 08 22:42:11 crc kubenswrapper[4834]: E1008 22:42:11.866753 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f6b5b1-dc89-433a-987a-5c122cfcd241" containerName="keystone-db-sync" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.866797 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f6b5b1-dc89-433a-987a-5c122cfcd241" containerName="keystone-db-sync" Oct 08 22:42:11 crc kubenswrapper[4834]: E1008 22:42:11.866847 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26081853-747e-4f7b-af9b-819dc967f807" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.866891 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="26081853-747e-4f7b-af9b-819dc967f807" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: E1008 22:42:11.866941 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eafc6c7f-9e9b-4232-b0dc-82225a78e1d2" 
containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.866991 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafc6c7f-9e9b-4232-b0dc-82225a78e1d2" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: E1008 22:42:11.867039 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="dnsmasq-dns" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.867080 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="dnsmasq-dns" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.867299 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="361e1c9f-8765-484b-a2ef-4b2e3db1af99" containerName="dnsmasq-dns" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.867360 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f6b5b1-dc89-433a-987a-5c122cfcd241" containerName="keystone-db-sync" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.867407 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="eafc6c7f-9e9b-4232-b0dc-82225a78e1d2" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.867462 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a0f324-ef5b-4320-b735-70a1f26376a0" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.867524 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="26081853-747e-4f7b-af9b-819dc967f807" containerName="mariadb-account-create" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.868355 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.895679 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b46c4c7-bwrtb"] Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.910612 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qxqnd"] Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.912517 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.918487 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.918603 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.920070 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.920336 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c58dz" Oct 08 22:42:11 crc kubenswrapper[4834]: I1008 22:42:11.944237 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qxqnd"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057297 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-sb\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057424 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-fernet-keys\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057555 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-svc\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057633 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-nb\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057697 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-config-data\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057768 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/53de884f-17e5-4487-a0ac-64d23f12383f-kube-api-access-jbppw\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057828 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-config\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057900 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-swift-storage-0\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.057982 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-scripts\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.058059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-combined-ca-bundle\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.058125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmlm2\" (UniqueName: \"kubernetes.io/projected/fc1208e7-3873-4aa5-9a30-266dcf1393d7-kube-api-access-rmlm2\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.058212 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-credential-keys\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.095843 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bdkgz"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.097004 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.101421 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.101647 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lfnk9" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.101771 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.107878 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.109718 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.115750 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.123811 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.138235 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bdkgz"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.147466 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-449pd"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.148780 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.152038 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.152333 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.152357 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rls8w" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164728 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-svc\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164785 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-nb\") 
pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164813 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-config-data\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164877 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/53de884f-17e5-4487-a0ac-64d23f12383f-kube-api-access-jbppw\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164909 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-config\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164948 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-swift-storage-0\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.164991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-scripts\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " 
pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.165027 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-combined-ca-bundle\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.165058 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmlm2\" (UniqueName: \"kubernetes.io/projected/fc1208e7-3873-4aa5-9a30-266dcf1393d7-kube-api-access-rmlm2\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.165104 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-credential-keys\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.165182 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-sb\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.165219 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-fernet-keys\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc 
kubenswrapper[4834]: I1008 22:42:12.166276 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-swift-storage-0\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.166903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-svc\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.167022 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.167839 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-nb\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.168989 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-config\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.170362 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-sb\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " 
pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.194907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-scripts\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.194909 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-fernet-keys\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.196240 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-config-data\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.196825 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-credential-keys\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.197118 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-combined-ca-bundle\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.217710 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/53de884f-17e5-4487-a0ac-64d23f12383f-kube-api-access-jbppw\") pod \"keystone-bootstrap-qxqnd\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.244772 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.255243 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmlm2\" (UniqueName: \"kubernetes.io/projected/fc1208e7-3873-4aa5-9a30-266dcf1393d7-kube-api-access-rmlm2\") pod \"dnsmasq-dns-758b46c4c7-bwrtb\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.269629 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-run-httpd\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270037 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-db-sync-config-data\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270108 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-log-httpd\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc 
kubenswrapper[4834]: I1008 22:42:12.270140 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kckgb\" (UniqueName: \"kubernetes.io/projected/15a24e03-f3be-433f-bbc1-3a25da713c65-kube-api-access-kckgb\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270197 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jhm\" (UniqueName: \"kubernetes.io/projected/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-kube-api-access-r8jhm\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270246 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-scripts\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270328 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15a24e03-f3be-433f-bbc1-3a25da713c65-etc-machine-id\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270354 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgvrz\" (UniqueName: \"kubernetes.io/projected/910a7214-6f0f-452b-adc2-91d1c2589d47-kube-api-access-kgvrz\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270377 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-combined-ca-bundle\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270409 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-config\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270613 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-combined-ca-bundle\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270791 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270900 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-config-data\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.270963 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.271133 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-scripts\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.273472 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-config-data\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.332405 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-449pd"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.366306 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mhj54"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.368131 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.371275 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.372817 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.373025 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rgzrw" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376317 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-config-data\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376353 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376396 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-scripts\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376411 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-config-data\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " 
pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376440 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-run-httpd\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376481 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-db-sync-config-data\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376503 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-log-httpd\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376523 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kckgb\" (UniqueName: \"kubernetes.io/projected/15a24e03-f3be-433f-bbc1-3a25da713c65-kube-api-access-kckgb\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376548 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jhm\" (UniqueName: \"kubernetes.io/projected/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-kube-api-access-r8jhm\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376574 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-scripts\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15a24e03-f3be-433f-bbc1-3a25da713c65-etc-machine-id\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376615 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgvrz\" (UniqueName: \"kubernetes.io/projected/910a7214-6f0f-452b-adc2-91d1c2589d47-kube-api-access-kgvrz\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376632 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-combined-ca-bundle\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376648 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-config\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376668 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-combined-ca-bundle\") pod \"cinder-db-sync-449pd\" (UID: 
\"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.376687 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.383262 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-config-data\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.386914 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.387564 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-combined-ca-bundle\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.388401 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-config\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.389375 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15a24e03-f3be-433f-bbc1-3a25da713c65-etc-machine-id\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.389136 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-g2wdt"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.391056 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-scripts\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.391518 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-log-httpd\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.392957 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-scripts\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.393321 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-config-data\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.393461 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-db-sync-config-data\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.393650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-run-httpd\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.394007 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.394158 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-combined-ca-bundle\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.398034 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.401550 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jhm\" (UniqueName: \"kubernetes.io/projected/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-kube-api-access-r8jhm\") pod \"ceilometer-0\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.405397 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zsfx5" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.406466 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.408380 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mhj54"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.416006 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgvrz\" (UniqueName: \"kubernetes.io/projected/910a7214-6f0f-452b-adc2-91d1c2589d47-kube-api-access-kgvrz\") pod \"neutron-db-sync-bdkgz\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.420167 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kckgb\" (UniqueName: \"kubernetes.io/projected/15a24e03-f3be-433f-bbc1-3a25da713c65-kube-api-access-kckgb\") pod \"cinder-db-sync-449pd\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.420841 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g2wdt"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.425165 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.432302 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.448706 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b46c4c7-bwrtb"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.449734 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.471185 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69894dfcd9-kpp4z"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.472800 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.477971 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-config-data\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478020 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-combined-ca-bundle\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478054 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-scripts\") pod 
\"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53c0210a-93d4-4f54-a542-d69c77229b9e-logs\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478212 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhb9x\" (UniqueName: \"kubernetes.io/projected/53c0210a-93d4-4f54-a542-d69c77229b9e-kube-api-access-zhb9x\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478323 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-combined-ca-bundle\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478431 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-db-sync-config-data\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.478530 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b87xv\" (UniqueName: 
\"kubernetes.io/projected/3d656893-3446-42fe-86ad-74e1b9d7ecd5-kube-api-access-b87xv\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.485126 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69894dfcd9-kpp4z"] Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.586976 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53c0210a-93d4-4f54-a542-d69c77229b9e-logs\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587028 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhb9x\" (UniqueName: \"kubernetes.io/projected/53c0210a-93d4-4f54-a542-d69c77229b9e-kube-api-access-zhb9x\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587052 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-svc\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587088 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-combined-ca-bundle\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587123 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-db-sync-config-data\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587137 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-sb\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587187 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b87xv\" (UniqueName: \"kubernetes.io/projected/3d656893-3446-42fe-86ad-74e1b9d7ecd5-kube-api-access-b87xv\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587225 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-nb\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587244 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-swift-storage-0\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587261 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-config-data\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587279 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbwbk\" (UniqueName: \"kubernetes.io/projected/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-kube-api-access-zbwbk\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-combined-ca-bundle\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587323 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-config\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.587339 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-scripts\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.591009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/53c0210a-93d4-4f54-a542-d69c77229b9e-logs\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.591524 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-scripts\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.595083 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-449pd" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.601054 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-combined-ca-bundle\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.605887 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-db-sync-config-data\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.612317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-config-data\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.617728 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-combined-ca-bundle\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.618184 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b87xv\" (UniqueName: \"kubernetes.io/projected/3d656893-3446-42fe-86ad-74e1b9d7ecd5-kube-api-access-b87xv\") pod \"barbican-db-sync-g2wdt\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.618236 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhb9x\" (UniqueName: \"kubernetes.io/projected/53c0210a-93d4-4f54-a542-d69c77229b9e-kube-api-access-zhb9x\") pod \"placement-db-sync-mhj54\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.693012 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-config\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.693704 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-svc\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.693760 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-sb\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.693810 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-nb\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.693829 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-swift-storage-0\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.693852 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbwbk\" (UniqueName: \"kubernetes.io/projected/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-kube-api-access-zbwbk\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.694950 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-config\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.695491 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-svc\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.695955 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-nb\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.703664 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-swift-storage-0\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.703824 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mhj54" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.704771 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-sb\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.727250 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.740727 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbwbk\" (UniqueName: \"kubernetes.io/projected/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-kube-api-access-zbwbk\") pod \"dnsmasq-dns-69894dfcd9-kpp4z\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.797706 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:12 crc kubenswrapper[4834]: I1008 22:42:12.876334 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qxqnd"] Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.151091 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bdkgz"] Oct 08 22:42:13 crc kubenswrapper[4834]: W1008 22:42:13.158307 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod910a7214_6f0f_452b_adc2_91d1c2589d47.slice/crio-83a6100e1df923d9a3f912b359ca3541a31605b56844e62a6c02d5feafa5c6f0 WatchSource:0}: Error finding container 83a6100e1df923d9a3f912b359ca3541a31605b56844e62a6c02d5feafa5c6f0: Status 404 returned error can't find the container with id 83a6100e1df923d9a3f912b359ca3541a31605b56844e62a6c02d5feafa5c6f0 Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.373986 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b46c4c7-bwrtb"] Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.395359 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mhj54"] Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.408275 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:42:13 crc 
kubenswrapper[4834]: I1008 22:42:13.612328 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-449pd"] Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.657650 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a479306-f4e1-49a0-9e1d-4ba54ecedf90","Type":"ContainerStarted","Data":"30f6c464fee03a17a900485019a5cc51474853e708ea0887c1fe8f6c31a22c76"} Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.659688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhj54" event={"ID":"53c0210a-93d4-4f54-a542-d69c77229b9e","Type":"ContainerStarted","Data":"bd0c7e56f0ae893606dcf3e6ae1ea8d2a655d462af51b53dc46ef4defb99b418"} Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.660786 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bdkgz" event={"ID":"910a7214-6f0f-452b-adc2-91d1c2589d47","Type":"ContainerStarted","Data":"83a6100e1df923d9a3f912b359ca3541a31605b56844e62a6c02d5feafa5c6f0"} Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.665043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qxqnd" event={"ID":"53de884f-17e5-4487-a0ac-64d23f12383f","Type":"ContainerStarted","Data":"c1ee0707b352d5c949d1b1752e202629de61e58089c1568cc17419ade6ddee20"} Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.665082 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qxqnd" event={"ID":"53de884f-17e5-4487-a0ac-64d23f12383f","Type":"ContainerStarted","Data":"2926896f48856089a41e90ff800b6b8a2c2e4fc95733216f4081ca515cfe1416"} Oct 08 22:42:13 crc kubenswrapper[4834]: W1008 22:42:13.667324 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d656893_3446_42fe_86ad_74e1b9d7ecd5.slice/crio-3bb8c8a2b1d0e2f962a924181fb3fd0e255c9d81e6f82a1143ede0f82de012b9 
WatchSource:0}: Error finding container 3bb8c8a2b1d0e2f962a924181fb3fd0e255c9d81e6f82a1143ede0f82de012b9: Status 404 returned error can't find the container with id 3bb8c8a2b1d0e2f962a924181fb3fd0e255c9d81e6f82a1143ede0f82de012b9 Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.668069 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" event={"ID":"fc1208e7-3873-4aa5-9a30-266dcf1393d7","Type":"ContainerStarted","Data":"89014a0dab921ded4f82e8c207fb9d529864d392c70754c7f6d5cbf4c45d9844"} Oct 08 22:42:13 crc kubenswrapper[4834]: W1008 22:42:13.693524 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5863766_1d50_4ba5_b01e_fbaff8a3dec4.slice/crio-d7abfbb66dbb1acc399a87d5090e9768023ad352ea876421255fb1cc1dbaac63 WatchSource:0}: Error finding container d7abfbb66dbb1acc399a87d5090e9768023ad352ea876421255fb1cc1dbaac63: Status 404 returned error can't find the container with id d7abfbb66dbb1acc399a87d5090e9768023ad352ea876421255fb1cc1dbaac63 Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.703814 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g2wdt"] Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.725474 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69894dfcd9-kpp4z"] Oct 08 22:42:13 crc kubenswrapper[4834]: I1008 22:42:13.741063 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qxqnd" podStartSLOduration=2.741042372 podStartE2EDuration="2.741042372s" podCreationTimestamp="2025-10-08 22:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:13.7128132 +0000 UTC m=+1141.535697946" watchObservedRunningTime="2025-10-08 22:42:13.741042372 +0000 UTC m=+1141.563927118" Oct 08 22:42:14 crc 
kubenswrapper[4834]: I1008 22:42:14.677347 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g2wdt" event={"ID":"3d656893-3446-42fe-86ad-74e1b9d7ecd5","Type":"ContainerStarted","Data":"3bb8c8a2b1d0e2f962a924181fb3fd0e255c9d81e6f82a1143ede0f82de012b9"} Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.680847 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-449pd" event={"ID":"15a24e03-f3be-433f-bbc1-3a25da713c65","Type":"ContainerStarted","Data":"07ae74e4e2b31e588e805946eb9d63d78dd157efad9f6c62d56e966b6884ca67"} Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.682689 4834 generic.go:334] "Generic (PLEG): container finished" podID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerID="3fb7dabff3ee591dafcfd0cfa826cee50db2e71a135715bbc283a782c42e5abd" exitCode=0 Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.682772 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" event={"ID":"c5863766-1d50-4ba5-b01e-fbaff8a3dec4","Type":"ContainerDied","Data":"3fb7dabff3ee591dafcfd0cfa826cee50db2e71a135715bbc283a782c42e5abd"} Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.682868 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" event={"ID":"c5863766-1d50-4ba5-b01e-fbaff8a3dec4","Type":"ContainerStarted","Data":"d7abfbb66dbb1acc399a87d5090e9768023ad352ea876421255fb1cc1dbaac63"} Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.688885 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bdkgz" event={"ID":"910a7214-6f0f-452b-adc2-91d1c2589d47","Type":"ContainerStarted","Data":"299426ab604bd850b419cb8a481e6164770f8d6cf6ebc2e23b67097f996fed1d"} Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.717846 4834 generic.go:334] "Generic (PLEG): container finished" podID="fc1208e7-3873-4aa5-9a30-266dcf1393d7" 
containerID="cfc56d7f8a0bbe2b04dabfe4acb0c8164de51eef34d01dce79bdea0f43ecd603" exitCode=0 Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.719435 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" event={"ID":"fc1208e7-3873-4aa5-9a30-266dcf1393d7","Type":"ContainerDied","Data":"cfc56d7f8a0bbe2b04dabfe4acb0c8164de51eef34d01dce79bdea0f43ecd603"} Oct 08 22:42:14 crc kubenswrapper[4834]: I1008 22:42:14.745122 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bdkgz" podStartSLOduration=2.745104978 podStartE2EDuration="2.745104978s" podCreationTimestamp="2025-10-08 22:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:14.737840822 +0000 UTC m=+1142.560725568" watchObservedRunningTime="2025-10-08 22:42:14.745104978 +0000 UTC m=+1142.567989724" Oct 08 22:42:15 crc kubenswrapper[4834]: I1008 22:42:15.817788 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:42:16 crc kubenswrapper[4834]: I1008 22:42:16.984109 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.026026 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.026095 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.105996 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-sb\") pod \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.106077 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmlm2\" (UniqueName: \"kubernetes.io/projected/fc1208e7-3873-4aa5-9a30-266dcf1393d7-kube-api-access-rmlm2\") pod \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.106171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-nb\") pod \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.106216 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-swift-storage-0\") pod \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.106403 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-svc\") pod \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.106483 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-config\") pod \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\" (UID: \"fc1208e7-3873-4aa5-9a30-266dcf1393d7\") " Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.115067 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1208e7-3873-4aa5-9a30-266dcf1393d7-kube-api-access-rmlm2" (OuterVolumeSpecName: "kube-api-access-rmlm2") pod "fc1208e7-3873-4aa5-9a30-266dcf1393d7" (UID: "fc1208e7-3873-4aa5-9a30-266dcf1393d7"). InnerVolumeSpecName "kube-api-access-rmlm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.140038 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-config" (OuterVolumeSpecName: "config") pod "fc1208e7-3873-4aa5-9a30-266dcf1393d7" (UID: "fc1208e7-3873-4aa5-9a30-266dcf1393d7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.142814 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc1208e7-3873-4aa5-9a30-266dcf1393d7" (UID: "fc1208e7-3873-4aa5-9a30-266dcf1393d7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.159183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc1208e7-3873-4aa5-9a30-266dcf1393d7" (UID: "fc1208e7-3873-4aa5-9a30-266dcf1393d7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.172105 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc1208e7-3873-4aa5-9a30-266dcf1393d7" (UID: "fc1208e7-3873-4aa5-9a30-266dcf1393d7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.176366 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc1208e7-3873-4aa5-9a30-266dcf1393d7" (UID: "fc1208e7-3873-4aa5-9a30-266dcf1393d7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.208807 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.209024 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.209112 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.209234 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmlm2\" (UniqueName: \"kubernetes.io/projected/fc1208e7-3873-4aa5-9a30-266dcf1393d7-kube-api-access-rmlm2\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.209316 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.209385 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc1208e7-3873-4aa5-9a30-266dcf1393d7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.764726 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" event={"ID":"c5863766-1d50-4ba5-b01e-fbaff8a3dec4","Type":"ContainerStarted","Data":"66bb14da89ca8710f611a5d4b30a7abb9586aa3a01ad78b966a65b229a7abd61"} Oct 08 22:42:17 crc 
kubenswrapper[4834]: I1008 22:42:17.765868 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.768236 4834 generic.go:334] "Generic (PLEG): container finished" podID="e0428da9-4f94-4297-b639-c8b777b1d216" containerID="9aa7536da54254fbbacb96af2d84bae4ba72329943d5c430446435f27ee30123" exitCode=0 Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.768281 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dcw47" event={"ID":"e0428da9-4f94-4297-b639-c8b777b1d216","Type":"ContainerDied","Data":"9aa7536da54254fbbacb96af2d84bae4ba72329943d5c430446435f27ee30123"} Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.771119 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" event={"ID":"fc1208e7-3873-4aa5-9a30-266dcf1393d7","Type":"ContainerDied","Data":"89014a0dab921ded4f82e8c207fb9d529864d392c70754c7f6d5cbf4c45d9844"} Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.771223 4834 scope.go:117] "RemoveContainer" containerID="cfc56d7f8a0bbe2b04dabfe4acb0c8164de51eef34d01dce79bdea0f43ecd603" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.772133 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-758b46c4c7-bwrtb" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.791781 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" podStartSLOduration=5.791747677 podStartE2EDuration="5.791747677s" podCreationTimestamp="2025-10-08 22:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:17.786172102 +0000 UTC m=+1145.609056848" watchObservedRunningTime="2025-10-08 22:42:17.791747677 +0000 UTC m=+1145.614632433" Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.924670 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b46c4c7-bwrtb"] Oct 08 22:42:17 crc kubenswrapper[4834]: I1008 22:42:17.936170 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758b46c4c7-bwrtb"] Oct 08 22:42:18 crc kubenswrapper[4834]: I1008 22:42:18.781684 4834 generic.go:334] "Generic (PLEG): container finished" podID="53de884f-17e5-4487-a0ac-64d23f12383f" containerID="c1ee0707b352d5c949d1b1752e202629de61e58089c1568cc17419ade6ddee20" exitCode=0 Oct 08 22:42:18 crc kubenswrapper[4834]: I1008 22:42:18.781756 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qxqnd" event={"ID":"53de884f-17e5-4487-a0ac-64d23f12383f","Type":"ContainerDied","Data":"c1ee0707b352d5c949d1b1752e202629de61e58089c1568cc17419ade6ddee20"} Oct 08 22:42:19 crc kubenswrapper[4834]: I1008 22:42:19.571755 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1208e7-3873-4aa5-9a30-266dcf1393d7" path="/var/lib/kubelet/pods/fc1208e7-3873-4aa5-9a30-266dcf1393d7/volumes" Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.933924 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dcw47" Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.938706 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.985248 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-fernet-keys\") pod \"53de884f-17e5-4487-a0ac-64d23f12383f\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.985322 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqtc6\" (UniqueName: \"kubernetes.io/projected/e0428da9-4f94-4297-b639-c8b777b1d216-kube-api-access-rqtc6\") pod \"e0428da9-4f94-4297-b639-c8b777b1d216\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986491 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-config-data\") pod \"53de884f-17e5-4487-a0ac-64d23f12383f\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986577 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-scripts\") pod \"53de884f-17e5-4487-a0ac-64d23f12383f\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986610 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-config-data\") pod \"e0428da9-4f94-4297-b639-c8b777b1d216\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " Oct 08 
22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986686 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-db-sync-config-data\") pod \"e0428da9-4f94-4297-b639-c8b777b1d216\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986719 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/53de884f-17e5-4487-a0ac-64d23f12383f-kube-api-access-jbppw\") pod \"53de884f-17e5-4487-a0ac-64d23f12383f\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986754 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-credential-keys\") pod \"53de884f-17e5-4487-a0ac-64d23f12383f\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986782 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-combined-ca-bundle\") pod \"53de884f-17e5-4487-a0ac-64d23f12383f\" (UID: \"53de884f-17e5-4487-a0ac-64d23f12383f\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.986840 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-combined-ca-bundle\") pod \"e0428da9-4f94-4297-b639-c8b777b1d216\" (UID: \"e0428da9-4f94-4297-b639-c8b777b1d216\") " Oct 08 22:42:20 crc kubenswrapper[4834]: I1008 22:42:20.996731 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/53de884f-17e5-4487-a0ac-64d23f12383f-kube-api-access-jbppw" (OuterVolumeSpecName: "kube-api-access-jbppw") pod "53de884f-17e5-4487-a0ac-64d23f12383f" (UID: "53de884f-17e5-4487-a0ac-64d23f12383f"). InnerVolumeSpecName "kube-api-access-jbppw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.000522 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53de884f-17e5-4487-a0ac-64d23f12383f" (UID: "53de884f-17e5-4487-a0ac-64d23f12383f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.004095 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e0428da9-4f94-4297-b639-c8b777b1d216" (UID: "e0428da9-4f94-4297-b639-c8b777b1d216"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.004441 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53de884f-17e5-4487-a0ac-64d23f12383f" (UID: "53de884f-17e5-4487-a0ac-64d23f12383f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.006985 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0428da9-4f94-4297-b639-c8b777b1d216-kube-api-access-rqtc6" (OuterVolumeSpecName: "kube-api-access-rqtc6") pod "e0428da9-4f94-4297-b639-c8b777b1d216" (UID: "e0428da9-4f94-4297-b639-c8b777b1d216"). InnerVolumeSpecName "kube-api-access-rqtc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.015732 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-scripts" (OuterVolumeSpecName: "scripts") pod "53de884f-17e5-4487-a0ac-64d23f12383f" (UID: "53de884f-17e5-4487-a0ac-64d23f12383f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.018523 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0428da9-4f94-4297-b639-c8b777b1d216" (UID: "e0428da9-4f94-4297-b639-c8b777b1d216"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.032661 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53de884f-17e5-4487-a0ac-64d23f12383f" (UID: "53de884f-17e5-4487-a0ac-64d23f12383f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.051396 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-config-data" (OuterVolumeSpecName: "config-data") pod "53de884f-17e5-4487-a0ac-64d23f12383f" (UID: "53de884f-17e5-4487-a0ac-64d23f12383f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.063968 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-config-data" (OuterVolumeSpecName: "config-data") pod "e0428da9-4f94-4297-b639-c8b777b1d216" (UID: "e0428da9-4f94-4297-b639-c8b777b1d216"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096797 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096834 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqtc6\" (UniqueName: \"kubernetes.io/projected/e0428da9-4f94-4297-b639-c8b777b1d216-kube-api-access-rqtc6\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096849 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096860 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: 
I1008 22:42:21.096870 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096882 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096893 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbppw\" (UniqueName: \"kubernetes.io/projected/53de884f-17e5-4487-a0ac-64d23f12383f-kube-api-access-jbppw\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096903 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096914 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53de884f-17e5-4487-a0ac-64d23f12383f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.096948 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0428da9-4f94-4297-b639-c8b777b1d216-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.814308 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dcw47" event={"ID":"e0428da9-4f94-4297-b639-c8b777b1d216","Type":"ContainerDied","Data":"1abe14f0728ee7d29d9a50e8ae18b5922de6b3120cee1f14d81cefd509d55079"} Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.814729 4834 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="1abe14f0728ee7d29d9a50e8ae18b5922de6b3120cee1f14d81cefd509d55079" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.814393 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dcw47" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.816962 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qxqnd" event={"ID":"53de884f-17e5-4487-a0ac-64d23f12383f","Type":"ContainerDied","Data":"2926896f48856089a41e90ff800b6b8a2c2e4fc95733216f4081ca515cfe1416"} Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.817014 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2926896f48856089a41e90ff800b6b8a2c2e4fc95733216f4081ca515cfe1416" Oct 08 22:42:21 crc kubenswrapper[4834]: I1008 22:42:21.817023 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qxqnd" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.128411 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qxqnd"] Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.135368 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qxqnd"] Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.242849 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8rqtg"] Oct 08 22:42:22 crc kubenswrapper[4834]: E1008 22:42:22.246299 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1208e7-3873-4aa5-9a30-266dcf1393d7" containerName="init" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.246341 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1208e7-3873-4aa5-9a30-266dcf1393d7" containerName="init" Oct 08 22:42:22 crc kubenswrapper[4834]: E1008 22:42:22.246363 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0428da9-4f94-4297-b639-c8b777b1d216" containerName="glance-db-sync" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.246372 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0428da9-4f94-4297-b639-c8b777b1d216" containerName="glance-db-sync" Oct 08 22:42:22 crc kubenswrapper[4834]: E1008 22:42:22.246391 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53de884f-17e5-4487-a0ac-64d23f12383f" containerName="keystone-bootstrap" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.246399 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="53de884f-17e5-4487-a0ac-64d23f12383f" containerName="keystone-bootstrap" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.246721 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0428da9-4f94-4297-b639-c8b777b1d216" containerName="glance-db-sync" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.246757 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1208e7-3873-4aa5-9a30-266dcf1393d7" containerName="init" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.246767 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="53de884f-17e5-4487-a0ac-64d23f12383f" containerName="keystone-bootstrap" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.247457 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.251403 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.251427 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c58dz" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.251707 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.252324 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.276193 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8rqtg"] Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.316226 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-config-data\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.316301 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqmw\" (UniqueName: \"kubernetes.io/projected/399e90af-d658-4f62-8efe-3c26b5f717ef-kube-api-access-wjqmw\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.316336 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-fernet-keys\") pod \"keystone-bootstrap-8rqtg\" (UID: 
\"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.316353 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-credential-keys\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.316390 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-combined-ca-bundle\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.316424 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-scripts\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.385725 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69894dfcd9-kpp4z"] Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.385933 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="dnsmasq-dns" containerID="cri-o://66bb14da89ca8710f611a5d4b30a7abb9586aa3a01ad78b966a65b229a7abd61" gracePeriod=10 Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.387466 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:22 crc 
kubenswrapper[4834]: I1008 22:42:22.417751 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-config-data\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.417812 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqmw\" (UniqueName: \"kubernetes.io/projected/399e90af-d658-4f62-8efe-3c26b5f717ef-kube-api-access-wjqmw\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.417843 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-fernet-keys\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.417857 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-credential-keys\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.417895 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-combined-ca-bundle\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.417928 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-scripts\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.428275 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-credential-keys\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.429502 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-combined-ca-bundle\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.429642 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-scripts\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.432129 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-config-data\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.439788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-fernet-keys\") pod 
\"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.443801 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77dd5cf987-ns26j"] Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.445368 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.468472 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77dd5cf987-ns26j"] Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.507759 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqmw\" (UniqueName: \"kubernetes.io/projected/399e90af-d658-4f62-8efe-3c26b5f717ef-kube-api-access-wjqmw\") pod \"keystone-bootstrap-8rqtg\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.519252 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-svc\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.519482 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-config\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.520159 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-nb\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.520269 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-sb\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.520426 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-swift-storage-0\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.520529 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr7tg\" (UniqueName: \"kubernetes.io/projected/22c740c5-2fd7-47bf-b34a-e4df82a1c970-kube-api-access-tr7tg\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.574780 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.622204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-nb\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.622250 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-sb\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.622298 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-swift-storage-0\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.622326 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr7tg\" (UniqueName: \"kubernetes.io/projected/22c740c5-2fd7-47bf-b34a-e4df82a1c970-kube-api-access-tr7tg\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.622381 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-svc\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " 
pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.622412 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-config\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.623354 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-svc\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.623351 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-config\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.623352 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-sb\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.629537 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-swift-storage-0\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.629775 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-nb\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.646436 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr7tg\" (UniqueName: \"kubernetes.io/projected/22c740c5-2fd7-47bf-b34a-e4df82a1c970-kube-api-access-tr7tg\") pod \"dnsmasq-dns-77dd5cf987-ns26j\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.798677 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Oct 08 22:42:22 crc kubenswrapper[4834]: I1008 22:42:22.871562 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.294795 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.296129 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.298903 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-fq748" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.299313 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.300049 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.309344 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.342399 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.342458 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.342504 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: 
I1008 22:42:23.342574 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-logs\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.342602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.342628 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdg8r\" (UniqueName: \"kubernetes.io/projected/cf5ba36b-3681-4697-b6fc-91f998504e84-kube-api-access-cdg8r\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.342703 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: 
I1008 22:42:23.444253 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444280 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444329 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-logs\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444345 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444364 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdg8r\" (UniqueName: \"kubernetes.io/projected/cf5ba36b-3681-4697-b6fc-91f998504e84-kube-api-access-cdg8r\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444417 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.444785 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.445324 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-logs\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.445388 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.449313 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.450411 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.459047 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.467779 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdg8r\" (UniqueName: \"kubernetes.io/projected/cf5ba36b-3681-4697-b6fc-91f998504e84-kube-api-access-cdg8r\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.475807 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.523918 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.525766 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.538473 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.538888 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.584697 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53de884f-17e5-4487-a0ac-64d23f12383f" path="/var/lib/kubelet/pods/53de884f-17e5-4487-a0ac-64d23f12383f/volumes" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.619348 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.646880 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.647365 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.647460 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.647515 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.647601 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.647633 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-logs\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.647663 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfwd2\" (UniqueName: \"kubernetes.io/projected/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-kube-api-access-xfwd2\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.748940 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749016 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749128 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-logs\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749330 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfwd2\" (UniqueName: \"kubernetes.io/projected/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-kube-api-access-xfwd2\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749461 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 
22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749549 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.749822 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.750085 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.752115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-logs\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.753952 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.755171 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.773051 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfwd2\" (UniqueName: \"kubernetes.io/projected/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-kube-api-access-xfwd2\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.774756 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.781577 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:42:23 crc kubenswrapper[4834]: I1008 22:42:23.864255 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:42:24 crc kubenswrapper[4834]: I1008 22:42:24.591737 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:42:24 crc kubenswrapper[4834]: I1008 22:42:24.656061 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:42:24 crc kubenswrapper[4834]: I1008 22:42:24.842647 4834 generic.go:334] "Generic (PLEG): container finished" podID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerID="66bb14da89ca8710f611a5d4b30a7abb9586aa3a01ad78b966a65b229a7abd61" exitCode=0 Oct 08 22:42:24 crc kubenswrapper[4834]: I1008 22:42:24.842691 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" event={"ID":"c5863766-1d50-4ba5-b01e-fbaff8a3dec4","Type":"ContainerDied","Data":"66bb14da89ca8710f611a5d4b30a7abb9586aa3a01ad78b966a65b229a7abd61"} Oct 08 22:42:27 crc kubenswrapper[4834]: I1008 22:42:27.798939 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Oct 08 22:42:30 crc kubenswrapper[4834]: E1008 22:42:30.577995 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032" Oct 08 22:42:30 crc kubenswrapper[4834]: E1008 22:42:30.578654 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n96h698hbbh8fh5f5h67fh7h5d9hd5h586h54fh554hfbh5b6h96hbbh647h5d6hcdh5c8h55dh566hb6h5f9h67bh9bh697h578h6dh66fh59ch677q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8jhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1a479306-f4e1-49a0-9e1d-4ba54ecedf90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:42:31 crc kubenswrapper[4834]: E1008 22:42:30.999666 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384" Oct 08 22:42:31 crc kubenswrapper[4834]: E1008 22:42:31.000261 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b87xv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-g2wdt_openstack(3d656893-3446-42fe-86ad-74e1b9d7ecd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:42:31 crc kubenswrapper[4834]: E1008 22:42:31.001870 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-g2wdt" 
podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.268777 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.299619 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-svc\") pod \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.299715 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-nb\") pod \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.300569 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-config\") pod \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.300606 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbwbk\" (UniqueName: \"kubernetes.io/projected/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-kube-api-access-zbwbk\") pod \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.300698 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-swift-storage-0\") pod \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " 
Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.300726 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-sb\") pod \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\" (UID: \"c5863766-1d50-4ba5-b01e-fbaff8a3dec4\") " Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.312849 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-kube-api-access-zbwbk" (OuterVolumeSpecName: "kube-api-access-zbwbk") pod "c5863766-1d50-4ba5-b01e-fbaff8a3dec4" (UID: "c5863766-1d50-4ba5-b01e-fbaff8a3dec4"). InnerVolumeSpecName "kube-api-access-zbwbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.400231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5863766-1d50-4ba5-b01e-fbaff8a3dec4" (UID: "c5863766-1d50-4ba5-b01e-fbaff8a3dec4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.403381 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.403421 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbwbk\" (UniqueName: \"kubernetes.io/projected/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-kube-api-access-zbwbk\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.415293 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5863766-1d50-4ba5-b01e-fbaff8a3dec4" (UID: "c5863766-1d50-4ba5-b01e-fbaff8a3dec4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.422818 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c5863766-1d50-4ba5-b01e-fbaff8a3dec4" (UID: "c5863766-1d50-4ba5-b01e-fbaff8a3dec4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.424808 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-config" (OuterVolumeSpecName: "config") pod "c5863766-1d50-4ba5-b01e-fbaff8a3dec4" (UID: "c5863766-1d50-4ba5-b01e-fbaff8a3dec4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.431686 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5863766-1d50-4ba5-b01e-fbaff8a3dec4" (UID: "c5863766-1d50-4ba5-b01e-fbaff8a3dec4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.505435 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.505463 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.505472 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.505482 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5863766-1d50-4ba5-b01e-fbaff8a3dec4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.600944 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77dd5cf987-ns26j"] Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.744756 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8rqtg"] Oct 08 22:42:31 crc kubenswrapper[4834]: W1008 22:42:31.755415 4834 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399e90af_d658_4f62_8efe_3c26b5f717ef.slice/crio-152beed6bfb5c79b0e87fc80f517459e14e65c96499f02b628beb651fa046dba WatchSource:0}: Error finding container 152beed6bfb5c79b0e87fc80f517459e14e65c96499f02b628beb651fa046dba: Status 404 returned error can't find the container with id 152beed6bfb5c79b0e87fc80f517459e14e65c96499f02b628beb651fa046dba Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.910054 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rqtg" event={"ID":"399e90af-d658-4f62-8efe-3c26b5f717ef","Type":"ContainerStarted","Data":"152beed6bfb5c79b0e87fc80f517459e14e65c96499f02b628beb651fa046dba"} Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.912783 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" event={"ID":"c5863766-1d50-4ba5-b01e-fbaff8a3dec4","Type":"ContainerDied","Data":"d7abfbb66dbb1acc399a87d5090e9768023ad352ea876421255fb1cc1dbaac63"} Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.912841 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69894dfcd9-kpp4z" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.912862 4834 scope.go:117] "RemoveContainer" containerID="66bb14da89ca8710f611a5d4b30a7abb9586aa3a01ad78b966a65b229a7abd61" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.916847 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhj54" event={"ID":"53c0210a-93d4-4f54-a542-d69c77229b9e","Type":"ContainerStarted","Data":"43fb96c167efdaaa1299cb06ffba0aaaeaff5f3e9c0bac460dec7ea52e491fdd"} Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.919479 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" event={"ID":"22c740c5-2fd7-47bf-b34a-e4df82a1c970","Type":"ContainerStarted","Data":"b58d03a3fff83c365590c44f6006233a20d8efc6ecb0cc5961d3aa00211e24d4"} Oct 08 22:42:31 crc kubenswrapper[4834]: E1008 22:42:31.921258 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384\\\"\"" pod="openstack/barbican-db-sync-g2wdt" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.934674 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69894dfcd9-kpp4z"] Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.940081 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69894dfcd9-kpp4z"] Oct 08 22:42:31 crc kubenswrapper[4834]: I1008 22:42:31.966670 4834 scope.go:117] "RemoveContainer" containerID="3fb7dabff3ee591dafcfd0cfa826cee50db2e71a135715bbc283a782c42e5abd" Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.213290 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] 
Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.795454 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:42:32 crc kubenswrapper[4834]: W1008 22:42:32.813657 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5ba36b_3681_4697_b6fc_91f998504e84.slice/crio-7c4f490dffbd1d5f4007b61cec63cc5047fa558789b3dc54876f0d8ee6761503 WatchSource:0}: Error finding container 7c4f490dffbd1d5f4007b61cec63cc5047fa558789b3dc54876f0d8ee6761503: Status 404 returned error can't find the container with id 7c4f490dffbd1d5f4007b61cec63cc5047fa558789b3dc54876f0d8ee6761503 Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.931116 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rqtg" event={"ID":"399e90af-d658-4f62-8efe-3c26b5f717ef","Type":"ContainerStarted","Data":"2aa265b22953ecc5556abee66523e5673bc96ed54d9755bb598e92161a98743d"} Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.937817 4834 generic.go:334] "Generic (PLEG): container finished" podID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerID="33e4af26bbf4130eedb59c9e1f95713af28585725ac37d9b6497b7c23f8f115f" exitCode=0 Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.937883 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" event={"ID":"22c740c5-2fd7-47bf-b34a-e4df82a1c970","Type":"ContainerDied","Data":"33e4af26bbf4130eedb59c9e1f95713af28585725ac37d9b6497b7c23f8f115f"} Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.940259 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a479306-f4e1-49a0-9e1d-4ba54ecedf90","Type":"ContainerStarted","Data":"9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409"} Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.941401 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9","Type":"ContainerStarted","Data":"01c07130a858b0ca5b5c3c9d32a7d9fa2b70dd228376d0de7735969a4896a72d"} Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.942474 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf5ba36b-3681-4697-b6fc-91f998504e84","Type":"ContainerStarted","Data":"7c4f490dffbd1d5f4007b61cec63cc5047fa558789b3dc54876f0d8ee6761503"} Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.954707 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8rqtg" podStartSLOduration=10.954690345 podStartE2EDuration="10.954690345s" podCreationTimestamp="2025-10-08 22:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:32.952906822 +0000 UTC m=+1160.775791578" watchObservedRunningTime="2025-10-08 22:42:32.954690345 +0000 UTC m=+1160.777575091" Oct 08 22:42:32 crc kubenswrapper[4834]: I1008 22:42:32.973029 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mhj54" podStartSLOduration=3.345286879 podStartE2EDuration="20.973011547s" podCreationTimestamp="2025-10-08 22:42:12 +0000 UTC" firstStartedPulling="2025-10-08 22:42:13.400554159 +0000 UTC m=+1141.223438905" lastFinishedPulling="2025-10-08 22:42:31.028278817 +0000 UTC m=+1158.851163573" observedRunningTime="2025-10-08 22:42:32.97059952 +0000 UTC m=+1160.793484266" watchObservedRunningTime="2025-10-08 22:42:32.973011547 +0000 UTC m=+1160.795896293" Oct 08 22:42:33 crc kubenswrapper[4834]: I1008 22:42:33.570234 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" path="/var/lib/kubelet/pods/c5863766-1d50-4ba5-b01e-fbaff8a3dec4/volumes" Oct 08 22:42:33 crc kubenswrapper[4834]: 
I1008 22:42:33.954453 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9","Type":"ContainerStarted","Data":"3b19e5a972cc89c1fbd8b19ee8a39a3a625931ab58a9f2c8c08464d02344da66"} Oct 08 22:42:35 crc kubenswrapper[4834]: I1008 22:42:35.973458 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf5ba36b-3681-4697-b6fc-91f998504e84","Type":"ContainerStarted","Data":"6e5718317ef00fa74ccd0dc6826c9b1fdbaa6dcbb6698c6420f2a3ab2ccf5e06"} Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.984713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" event={"ID":"22c740c5-2fd7-47bf-b34a-e4df82a1c970","Type":"ContainerStarted","Data":"cc29667f1378c23afae6203d0838cde7a3a89a414594717058bb98319b018457"} Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.985100 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.988159 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9","Type":"ContainerStarted","Data":"065422b62f29ebb770a2a17cf4901b746e4a831b80ca68c24a38396fea95961a"} Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.988251 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-log" containerID="cri-o://3b19e5a972cc89c1fbd8b19ee8a39a3a625931ab58a9f2c8c08464d02344da66" gracePeriod=30 Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.988292 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" 
containerName="glance-httpd" containerID="cri-o://065422b62f29ebb770a2a17cf4901b746e4a831b80ca68c24a38396fea95961a" gracePeriod=30 Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.990224 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf5ba36b-3681-4697-b6fc-91f998504e84","Type":"ContainerStarted","Data":"2b726a6b152c919d8738f4cff88285591245d9b7621ed51082be530db5750148"} Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.990315 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-log" containerID="cri-o://6e5718317ef00fa74ccd0dc6826c9b1fdbaa6dcbb6698c6420f2a3ab2ccf5e06" gracePeriod=30 Oct 08 22:42:36 crc kubenswrapper[4834]: I1008 22:42:36.990383 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-httpd" containerID="cri-o://2b726a6b152c919d8738f4cff88285591245d9b7621ed51082be530db5750148" gracePeriod=30 Oct 08 22:42:37 crc kubenswrapper[4834]: I1008 22:42:37.003457 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" podStartSLOduration=15.003440623 podStartE2EDuration="15.003440623s" podCreationTimestamp="2025-10-08 22:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:36.998484923 +0000 UTC m=+1164.821369669" watchObservedRunningTime="2025-10-08 22:42:37.003440623 +0000 UTC m=+1164.826325369" Oct 08 22:42:37 crc kubenswrapper[4834]: I1008 22:42:37.019536 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.019518061 podStartE2EDuration="15.019518061s" 
podCreationTimestamp="2025-10-08 22:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:37.017520093 +0000 UTC m=+1164.840404839" watchObservedRunningTime="2025-10-08 22:42:37.019518061 +0000 UTC m=+1164.842402807" Oct 08 22:42:37 crc kubenswrapper[4834]: I1008 22:42:37.038804 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.038787766 podStartE2EDuration="15.038787766s" podCreationTimestamp="2025-10-08 22:42:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:42:37.03559923 +0000 UTC m=+1164.858483976" watchObservedRunningTime="2025-10-08 22:42:37.038787766 +0000 UTC m=+1164.861672512" Oct 08 22:42:38 crc kubenswrapper[4834]: I1008 22:42:38.006602 4834 generic.go:334] "Generic (PLEG): container finished" podID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerID="6e5718317ef00fa74ccd0dc6826c9b1fdbaa6dcbb6698c6420f2a3ab2ccf5e06" exitCode=143 Oct 08 22:42:38 crc kubenswrapper[4834]: I1008 22:42:38.006665 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf5ba36b-3681-4697-b6fc-91f998504e84","Type":"ContainerDied","Data":"6e5718317ef00fa74ccd0dc6826c9b1fdbaa6dcbb6698c6420f2a3ab2ccf5e06"} Oct 08 22:42:38 crc kubenswrapper[4834]: I1008 22:42:38.009856 4834 generic.go:334] "Generic (PLEG): container finished" podID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerID="3b19e5a972cc89c1fbd8b19ee8a39a3a625931ab58a9f2c8c08464d02344da66" exitCode=143 Oct 08 22:42:38 crc kubenswrapper[4834]: I1008 22:42:38.010088 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9","Type":"ContainerDied","Data":"3b19e5a972cc89c1fbd8b19ee8a39a3a625931ab58a9f2c8c08464d02344da66"} Oct 08 22:42:39 crc kubenswrapper[4834]: I1008 22:42:39.018957 4834 generic.go:334] "Generic (PLEG): container finished" podID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerID="065422b62f29ebb770a2a17cf4901b746e4a831b80ca68c24a38396fea95961a" exitCode=0 Oct 08 22:42:39 crc kubenswrapper[4834]: I1008 22:42:39.019024 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9","Type":"ContainerDied","Data":"065422b62f29ebb770a2a17cf4901b746e4a831b80ca68c24a38396fea95961a"} Oct 08 22:42:39 crc kubenswrapper[4834]: I1008 22:42:39.021177 4834 generic.go:334] "Generic (PLEG): container finished" podID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerID="2b726a6b152c919d8738f4cff88285591245d9b7621ed51082be530db5750148" exitCode=0 Oct 08 22:42:39 crc kubenswrapper[4834]: I1008 22:42:39.021248 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf5ba36b-3681-4697-b6fc-91f998504e84","Type":"ContainerDied","Data":"2b726a6b152c919d8738f4cff88285591245d9b7621ed51082be530db5750148"} Oct 08 22:42:42 crc kubenswrapper[4834]: I1008 22:42:42.045612 4834 generic.go:334] "Generic (PLEG): container finished" podID="399e90af-d658-4f62-8efe-3c26b5f717ef" containerID="2aa265b22953ecc5556abee66523e5673bc96ed54d9755bb598e92161a98743d" exitCode=0 Oct 08 22:42:42 crc kubenswrapper[4834]: I1008 22:42:42.045701 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rqtg" event={"ID":"399e90af-d658-4f62-8efe-3c26b5f717ef","Type":"ContainerDied","Data":"2aa265b22953ecc5556abee66523e5673bc96ed54d9755bb598e92161a98743d"} Oct 08 22:42:42 crc kubenswrapper[4834]: I1008 22:42:42.873402 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:42:42 crc kubenswrapper[4834]: I1008 22:42:42.996071 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59f45f6cf7-4gxlc"] Oct 08 22:42:42 crc kubenswrapper[4834]: I1008 22:42:42.996486 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" containerID="cri-o://bda88da6127166843e209ec22e5eb1b0a712814c0eb0fbc484f3b3b9baa29e6d" gracePeriod=10 Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:47.013222 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:47.025955 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:47.026016 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:47.026059 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:47.026753 4834 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae7fc299ba63d30578a076e14832e2ba3dd0a6f32f375b1c858285b17f026ca6"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:47.026809 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://ae7fc299ba63d30578a076e14832e2ba3dd0a6f32f375b1c858285b17f026ca6" gracePeriod=600 Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:48.114712 4834 generic.go:334] "Generic (PLEG): container finished" podID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerID="bda88da6127166843e209ec22e5eb1b0a712814c0eb0fbc484f3b3b9baa29e6d" exitCode=0 Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:48.114809 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" event={"ID":"532b6dee-483f-40e2-a1a6-d2d9af582e97","Type":"ContainerDied","Data":"bda88da6127166843e209ec22e5eb1b0a712814c0eb0fbc484f3b3b9baa29e6d"} Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:48.121647 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="ae7fc299ba63d30578a076e14832e2ba3dd0a6f32f375b1c858285b17f026ca6" exitCode=0 Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:48.121722 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"ae7fc299ba63d30578a076e14832e2ba3dd0a6f32f375b1c858285b17f026ca6"} Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:48.121955 4834 scope.go:117] "RemoveContainer" 
containerID="c4baa6db4e38cdb0c141b99f90c2bb4b5f7f47f94b55109a60fc26c2c73b21d9" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:52.013566 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:53.620522 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:53.621054 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:53.865947 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:42:55 crc kubenswrapper[4834]: I1008 22:42:53.866050 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:42:57 crc kubenswrapper[4834]: I1008 22:42:57.013194 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Oct 08 22:42:57 crc kubenswrapper[4834]: I1008 22:42:57.013561 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:43:01 crc kubenswrapper[4834]: E1008 22:43:01.225313 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384" Oct 08 22:43:01 crc kubenswrapper[4834]: E1008 
22:43:01.226026 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b87xv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-g2wdt_openstack(3d656893-3446-42fe-86ad-74e1b9d7ecd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:43:01 crc 
kubenswrapper[4834]: E1008 22:43:01.227558 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-g2wdt" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.260953 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8rqtg" event={"ID":"399e90af-d658-4f62-8efe-3c26b5f717ef","Type":"ContainerDied","Data":"152beed6bfb5c79b0e87fc80f517459e14e65c96499f02b628beb651fa046dba"} Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.260991 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152beed6bfb5c79b0e87fc80f517459e14e65c96499f02b628beb651fa046dba" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.294865 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.446387 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-scripts\") pod \"399e90af-d658-4f62-8efe-3c26b5f717ef\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.446855 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-fernet-keys\") pod \"399e90af-d658-4f62-8efe-3c26b5f717ef\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.446949 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-credential-keys\") pod 
\"399e90af-d658-4f62-8efe-3c26b5f717ef\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.447053 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-config-data\") pod \"399e90af-d658-4f62-8efe-3c26b5f717ef\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.447089 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjqmw\" (UniqueName: \"kubernetes.io/projected/399e90af-d658-4f62-8efe-3c26b5f717ef-kube-api-access-wjqmw\") pod \"399e90af-d658-4f62-8efe-3c26b5f717ef\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.447125 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-combined-ca-bundle\") pod \"399e90af-d658-4f62-8efe-3c26b5f717ef\" (UID: \"399e90af-d658-4f62-8efe-3c26b5f717ef\") " Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.452592 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "399e90af-d658-4f62-8efe-3c26b5f717ef" (UID: "399e90af-d658-4f62-8efe-3c26b5f717ef"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.453467 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399e90af-d658-4f62-8efe-3c26b5f717ef-kube-api-access-wjqmw" (OuterVolumeSpecName: "kube-api-access-wjqmw") pod "399e90af-d658-4f62-8efe-3c26b5f717ef" (UID: "399e90af-d658-4f62-8efe-3c26b5f717ef"). InnerVolumeSpecName "kube-api-access-wjqmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.453583 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "399e90af-d658-4f62-8efe-3c26b5f717ef" (UID: "399e90af-d658-4f62-8efe-3c26b5f717ef"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.454201 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-scripts" (OuterVolumeSpecName: "scripts") pod "399e90af-d658-4f62-8efe-3c26b5f717ef" (UID: "399e90af-d658-4f62-8efe-3c26b5f717ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.482231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-config-data" (OuterVolumeSpecName: "config-data") pod "399e90af-d658-4f62-8efe-3c26b5f717ef" (UID: "399e90af-d658-4f62-8efe-3c26b5f717ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.496180 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "399e90af-d658-4f62-8efe-3c26b5f717ef" (UID: "399e90af-d658-4f62-8efe-3c26b5f717ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.549576 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.549613 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.549629 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.549643 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjqmw\" (UniqueName: \"kubernetes.io/projected/399e90af-d658-4f62-8efe-3c26b5f717ef-kube-api-access-wjqmw\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.549662 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:01 crc kubenswrapper[4834]: I1008 22:43:01.549675 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/399e90af-d658-4f62-8efe-3c26b5f717ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.013099 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Oct 08 22:43:02 crc 
kubenswrapper[4834]: I1008 22:43:02.269062 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8rqtg" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.424617 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-57cf4d469b-9sj2l"] Oct 08 22:43:02 crc kubenswrapper[4834]: E1008 22:43:02.425333 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="dnsmasq-dns" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.425352 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="dnsmasq-dns" Oct 08 22:43:02 crc kubenswrapper[4834]: E1008 22:43:02.425379 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="init" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.425386 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="init" Oct 08 22:43:02 crc kubenswrapper[4834]: E1008 22:43:02.425424 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399e90af-d658-4f62-8efe-3c26b5f717ef" containerName="keystone-bootstrap" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.425434 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="399e90af-d658-4f62-8efe-3c26b5f717ef" containerName="keystone-bootstrap" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.425623 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5863766-1d50-4ba5-b01e-fbaff8a3dec4" containerName="dnsmasq-dns" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.425645 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="399e90af-d658-4f62-8efe-3c26b5f717ef" containerName="keystone-bootstrap" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.426396 4834 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.445759 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.446271 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.445801 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.446111 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.447564 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.455850 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c58dz" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.455866 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-57cf4d469b-9sj2l"] Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.466717 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68b9s\" (UniqueName: \"kubernetes.io/projected/788f2464-05b4-4c9a-bd83-6c1365740166-kube-api-access-68b9s\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.466778 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-fernet-keys\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") 
" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.466893 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-credential-keys\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.466927 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-public-tls-certs\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.467016 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-config-data\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.467070 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-combined-ca-bundle\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.467122 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-internal-tls-certs\") pod \"keystone-57cf4d469b-9sj2l\" (UID: 
\"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.467249 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-scripts\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.568931 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-credential-keys\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.569702 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-public-tls-certs\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.569758 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-config-data\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.569789 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-combined-ca-bundle\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" 
Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.569854 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-internal-tls-certs\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.569974 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-scripts\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.570063 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68b9s\" (UniqueName: \"kubernetes.io/projected/788f2464-05b4-4c9a-bd83-6c1365740166-kube-api-access-68b9s\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.570112 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-fernet-keys\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.574374 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-scripts\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.574396 4834 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-fernet-keys\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.576280 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-public-tls-certs\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.576459 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-config-data\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.576558 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-internal-tls-certs\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.577037 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-combined-ca-bundle\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.577066 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-credential-keys\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.588254 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68b9s\" (UniqueName: \"kubernetes.io/projected/788f2464-05b4-4c9a-bd83-6c1365740166-kube-api-access-68b9s\") pod \"keystone-57cf4d469b-9sj2l\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:02 crc kubenswrapper[4834]: I1008 22:43:02.740501 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:04 crc kubenswrapper[4834]: E1008 22:43:04.954959 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 08 22:43:04 crc kubenswrapper[4834]: E1008 22:43:04.955733 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kckgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-449pd_openstack(15a24e03-f3be-433f-bbc1-3a25da713c65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:43:04 crc kubenswrapper[4834]: E1008 22:43:04.957068 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-449pd" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" Oct 08 22:43:05 crc kubenswrapper[4834]: E1008 22:43:05.298400 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-449pd" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" Oct 08 22:43:12 crc kubenswrapper[4834]: I1008 22:43:12.013339 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.029913 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.030491 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:sg-core,Image:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:sg-core-conf-yaml,ReadOnly:false,MountPath:/etc/sg-core.conf.yaml,SubPath:sg-core.conf.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8jhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1a479306-f4e1-49a0-9e1d-4ba54ecedf90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.168372 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.205477 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.226260 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267563 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-scripts\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267639 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdg8r\" (UniqueName: \"kubernetes.io/projected/cf5ba36b-3681-4697-b6fc-91f998504e84-kube-api-access-cdg8r\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267690 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267726 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-logs\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267776 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-httpd-run\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267818 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-config-data\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.267883 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-combined-ca-bundle\") pod \"cf5ba36b-3681-4697-b6fc-91f998504e84\" (UID: \"cf5ba36b-3681-4697-b6fc-91f998504e84\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.269616 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.269768 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-logs" (OuterVolumeSpecName: "logs") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.276253 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-scripts" (OuterVolumeSpecName: "scripts") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.276582 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf5ba36b-3681-4697-b6fc-91f998504e84-kube-api-access-cdg8r" (OuterVolumeSpecName: "kube-api-access-cdg8r") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "kube-api-access-cdg8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.286979 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.306838 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.318822 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-config-data" (OuterVolumeSpecName: "config-data") pod "cf5ba36b-3681-4697-b6fc-91f998504e84" (UID: "cf5ba36b-3681-4697-b6fc-91f998504e84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.368879 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-config-data\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.368929 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-sb\") pod \"532b6dee-483f-40e2-a1a6-d2d9af582e97\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.368966 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlpfr\" (UniqueName: \"kubernetes.io/projected/532b6dee-483f-40e2-a1a6-d2d9af582e97-kube-api-access-dlpfr\") pod \"532b6dee-483f-40e2-a1a6-d2d9af582e97\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.368997 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-config\") pod \"532b6dee-483f-40e2-a1a6-d2d9af582e97\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369022 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369041 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-swift-storage-0\") pod \"532b6dee-483f-40e2-a1a6-d2d9af582e97\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369079 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfwd2\" (UniqueName: \"kubernetes.io/projected/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-kube-api-access-xfwd2\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369095 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-nb\") pod \"532b6dee-483f-40e2-a1a6-d2d9af582e97\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369111 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-httpd-run\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369194 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-svc\") pod \"532b6dee-483f-40e2-a1a6-d2d9af582e97\" (UID: \"532b6dee-483f-40e2-a1a6-d2d9af582e97\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369226 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-combined-ca-bundle\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 
22:43:13.369251 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-scripts\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369266 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-logs\") pod \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\" (UID: \"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9\") " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369619 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369636 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369668 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdg8r\" (UniqueName: \"kubernetes.io/projected/cf5ba36b-3681-4697-b6fc-91f998504e84-kube-api-access-cdg8r\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369688 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369697 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369707 4834 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf5ba36b-3681-4697-b6fc-91f998504e84-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.369716 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf5ba36b-3681-4697-b6fc-91f998504e84-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.371469 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-logs" (OuterVolumeSpecName: "logs") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.374354 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.374633 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.376292 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-kube-api-access-xfwd2" (OuterVolumeSpecName: "kube-api-access-xfwd2") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "kube-api-access-xfwd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.376695 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-scripts" (OuterVolumeSpecName: "scripts") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.386035 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532b6dee-483f-40e2-a1a6-d2d9af582e97-kube-api-access-dlpfr" (OuterVolumeSpecName: "kube-api-access-dlpfr") pod "532b6dee-483f-40e2-a1a6-d2d9af582e97" (UID: "532b6dee-483f-40e2-a1a6-d2d9af582e97"). InnerVolumeSpecName "kube-api-access-dlpfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.392557 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.402788 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" event={"ID":"532b6dee-483f-40e2-a1a6-d2d9af582e97","Type":"ContainerDied","Data":"4e9c98aada1cc187a20ba326096da1f6db2366f0849c47c1b3206e018d283c3b"} Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.402841 4834 scope.go:117] "RemoveContainer" containerID="bda88da6127166843e209ec22e5eb1b0a712814c0eb0fbc484f3b3b9baa29e6d" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.402927 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.422268 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62b1c4e-e49f-4d2f-b08c-5d4362517cf9","Type":"ContainerDied","Data":"01c07130a858b0ca5b5c3c9d32a7d9fa2b70dd228376d0de7735969a4896a72d"} Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.422340 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.426603 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.428566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf5ba36b-3681-4697-b6fc-91f998504e84","Type":"ContainerDied","Data":"7c4f490dffbd1d5f4007b61cec63cc5047fa558789b3dc54876f0d8ee6761503"} Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.428630 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.432753 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "532b6dee-483f-40e2-a1a6-d2d9af582e97" (UID: "532b6dee-483f-40e2-a1a6-d2d9af582e97"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.440391 4834 scope.go:117] "RemoveContainer" containerID="61a624379d927b98ebda00e533f085b2f8f5d292801d76fb051fcf85248dad09" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.441058 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "532b6dee-483f-40e2-a1a6-d2d9af582e97" (UID: "532b6dee-483f-40e2-a1a6-d2d9af582e97"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.444780 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-config" (OuterVolumeSpecName: "config") pod "532b6dee-483f-40e2-a1a6-d2d9af582e97" (UID: "532b6dee-483f-40e2-a1a6-d2d9af582e97"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.447333 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "532b6dee-483f-40e2-a1a6-d2d9af582e97" (UID: "532b6dee-483f-40e2-a1a6-d2d9af582e97"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.449689 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-config-data" (OuterVolumeSpecName: "config-data") pod "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" (UID: "f62b1c4e-e49f-4d2f-b08c-5d4362517cf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.455915 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "532b6dee-483f-40e2-a1a6-d2d9af582e97" (UID: "532b6dee-483f-40e2-a1a6-d2d9af582e97"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.463530 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470849 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfwd2\" (UniqueName: \"kubernetes.io/projected/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-kube-api-access-xfwd2\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470883 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470893 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470902 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470910 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470919 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470926 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470936 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470944 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470952 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470960 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlpfr\" (UniqueName: \"kubernetes.io/projected/532b6dee-483f-40e2-a1a6-d2d9af582e97-kube-api-access-dlpfr\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.470995 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.471026 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.471036 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/532b6dee-483f-40e2-a1a6-d2d9af582e97-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 
22:43:13.471419 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.472256 4834 scope.go:117] "RemoveContainer" containerID="065422b62f29ebb770a2a17cf4901b746e4a831b80ca68c24a38396fea95961a" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.497681 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.501792 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.502277 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-httpd" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502300 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-httpd" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.502312 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-httpd" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502320 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-httpd" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.502341 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502349 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.502372 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-log" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502380 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-log" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.502401 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-log" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502409 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-log" Oct 08 22:43:13 crc kubenswrapper[4834]: E1008 22:43:13.502428 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="init" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502437 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="init" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502627 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-httpd" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502647 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-httpd" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502655 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" containerName="glance-log" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502675 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" containerName="glance-log" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.502689 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" 
containerName="dnsmasq-dns" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.503842 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.505875 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.506616 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.512117 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.535059 4834 scope.go:117] "RemoveContainer" containerID="3b19e5a972cc89c1fbd8b19ee8a39a3a625931ab58a9f2c8c08464d02344da66" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.571744 4834 scope.go:117] "RemoveContainer" containerID="2b726a6b152c919d8738f4cff88285591245d9b7621ed51082be530db5750148" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.572526 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.576416 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf5ba36b-3681-4697-b6fc-91f998504e84" path="/var/lib/kubelet/pods/cf5ba36b-3681-4697-b6fc-91f998504e84/volumes" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.577521 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-57cf4d469b-9sj2l"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.600428 4834 scope.go:117] "RemoveContainer" containerID="6e5718317ef00fa74ccd0dc6826c9b1fdbaa6dcbb6698c6420f2a3ab2ccf5e06" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674535 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-scripts\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674603 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674648 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674663 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-config-data\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674714 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7s9\" (UniqueName: \"kubernetes.io/projected/cef930bb-c211-441f-a59f-e797704ce837-kube-api-access-bq7s9\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc 
kubenswrapper[4834]: I1008 22:43:13.674733 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674771 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.674801 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-logs\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.733763 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59f45f6cf7-4gxlc"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.743181 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59f45f6cf7-4gxlc"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.756476 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.768314 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.776827 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7s9\" 
(UniqueName: \"kubernetes.io/projected/cef930bb-c211-441f-a59f-e797704ce837-kube-api-access-bq7s9\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.776883 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.776926 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.776964 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-logs\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.777033 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-scripts\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.777075 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.777108 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.777132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-config-data\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.779345 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-logs\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.779815 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.780090 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.783988 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.784496 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.785517 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-config-data\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.789526 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-scripts\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.797340 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.800426 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.805039 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.805268 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.822503 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.825499 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7s9\" (UniqueName: \"kubernetes.io/projected/cef930bb-c211-441f-a59f-e797704ce837-kube-api-access-bq7s9\") pod \"glance-default-external-api-0\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") " pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.828394 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.828981 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981380 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-logs\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981429 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft264\" (UniqueName: \"kubernetes.io/projected/b34caaaa-9ad3-42b8-8537-876601474580-kube-api-access-ft264\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981484 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981508 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981542 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981564 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981583 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:13 crc kubenswrapper[4834]: I1008 22:43:13.981604 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083415 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083462 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083487 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083506 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083577 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.083595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft264\" (UniqueName: \"kubernetes.io/projected/b34caaaa-9ad3-42b8-8537-876601474580-kube-api-access-ft264\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.084210 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.085503 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.085735 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-logs\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.090191 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") 
" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.091691 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.096029 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.097110 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.117829 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft264\" (UniqueName: \"kubernetes.io/projected/b34caaaa-9ad3-42b8-8537-876601474580-kube-api-access-ft264\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.135234 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: 
I1008 22:43:14.140810 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.184299 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.505397 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57cf4d469b-9sj2l" event={"ID":"788f2464-05b4-4c9a-bd83-6c1365740166","Type":"ContainerStarted","Data":"1604636fd334c22a16f8d495172a59b50e33f1ed2b35af73ba1fc55f9dd3f3c9"} Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.505801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57cf4d469b-9sj2l" event={"ID":"788f2464-05b4-4c9a-bd83-6c1365740166","Type":"ContainerStarted","Data":"d84cd735daabc7206b8d9676421b437b88d922ac103ce0d626c8b224e8af4f31"} Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.505840 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.537682 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-57cf4d469b-9sj2l" podStartSLOduration=12.537667224 podStartE2EDuration="12.537667224s" podCreationTimestamp="2025-10-08 22:43:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:43:14.528318898 +0000 UTC m=+1202.351203644" watchObservedRunningTime="2025-10-08 22:43:14.537667224 +0000 UTC m=+1202.360551970" Oct 08 22:43:14 crc kubenswrapper[4834]: E1008 22:43:14.580104 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384\\\"\"" pod="openstack/barbican-db-sync-g2wdt" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.584851 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"6163dc66da07cee67fb6457237db1afec09f5bef4082ecfbaefff77cd8dc028c"} Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.596553 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef930bb-c211-441f-a59f-e797704ce837","Type":"ContainerStarted","Data":"d6babd5298cb2fe00e5a82ee5906f571f8aa371eb5dfe4382284a2497d7d960c"} Oct 08 22:43:14 crc kubenswrapper[4834]: I1008 22:43:14.698867 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:43:15 crc kubenswrapper[4834]: I1008 22:43:15.566370 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" path="/var/lib/kubelet/pods/532b6dee-483f-40e2-a1a6-d2d9af582e97/volumes" Oct 08 22:43:15 crc kubenswrapper[4834]: I1008 22:43:15.567715 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62b1c4e-e49f-4d2f-b08c-5d4362517cf9" path="/var/lib/kubelet/pods/f62b1c4e-e49f-4d2f-b08c-5d4362517cf9/volumes" Oct 08 22:43:15 crc kubenswrapper[4834]: I1008 22:43:15.610119 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b34caaaa-9ad3-42b8-8537-876601474580","Type":"ContainerStarted","Data":"dc5e2e8417ace5c999280b76594b36eb8fbcb6af9691cc8cce6e64f0a2c22444"} Oct 08 22:43:15 crc kubenswrapper[4834]: I1008 22:43:15.610185 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"b34caaaa-9ad3-42b8-8537-876601474580","Type":"ContainerStarted","Data":"653fa47fc8457b754c433323c3245942fc4829eb9dfa7977f529500398d68ad0"} Oct 08 22:43:15 crc kubenswrapper[4834]: I1008 22:43:15.612390 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef930bb-c211-441f-a59f-e797704ce837","Type":"ContainerStarted","Data":"8e857802b4273bb3fc6019e44989d03f7abc803eb79bcd62026051a8aaa24e78"} Oct 08 22:43:16 crc kubenswrapper[4834]: I1008 22:43:16.627479 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef930bb-c211-441f-a59f-e797704ce837","Type":"ContainerStarted","Data":"417e9a1a9853a87e4154fd1f997b54fa716f0e33a21ddc62dd44f49568347aaf"} Oct 08 22:43:17 crc kubenswrapper[4834]: I1008 22:43:17.014303 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59f45f6cf7-4gxlc" podUID="532b6dee-483f-40e2-a1a6-d2d9af582e97" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: i/o timeout" Oct 08 22:43:17 crc kubenswrapper[4834]: I1008 22:43:17.676328 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.676304997 podStartE2EDuration="4.676304997s" podCreationTimestamp="2025-10-08 22:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:43:17.663758883 +0000 UTC m=+1205.486643639" watchObservedRunningTime="2025-10-08 22:43:17.676304997 +0000 UTC m=+1205.499189753" Oct 08 22:43:18 crc kubenswrapper[4834]: I1008 22:43:18.652731 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"b34caaaa-9ad3-42b8-8537-876601474580","Type":"ContainerStarted","Data":"3d96780e3f5bf5401fcf705807bcc64c49484592f8ee3396cf64804cc4dd5a27"} Oct 08 22:43:18 crc kubenswrapper[4834]: I1008 22:43:18.700562 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.700533206 podStartE2EDuration="5.700533206s" podCreationTimestamp="2025-10-08 22:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:43:18.683676728 +0000 UTC m=+1206.506561554" watchObservedRunningTime="2025-10-08 22:43:18.700533206 +0000 UTC m=+1206.523417992" Oct 08 22:43:23 crc kubenswrapper[4834]: I1008 22:43:23.829667 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 22:43:23 crc kubenswrapper[4834]: I1008 22:43:23.830523 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 22:43:23 crc kubenswrapper[4834]: I1008 22:43:23.882057 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 22:43:23 crc kubenswrapper[4834]: I1008 22:43:23.903920 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.141453 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.141534 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.201237 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.232046 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.717238 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.717904 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.717940 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:43:24 crc kubenswrapper[4834]: I1008 22:43:24.717966 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:43:26 crc kubenswrapper[4834]: I1008 22:43:26.621233 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 22:43:26 crc kubenswrapper[4834]: I1008 22:43:26.675947 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:26 crc kubenswrapper[4834]: I1008 22:43:26.684664 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:43:26 crc kubenswrapper[4834]: I1008 22:43:26.749638 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:43:27 crc kubenswrapper[4834]: I1008 22:43:27.647625 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 22:43:34 crc kubenswrapper[4834]: I1008 22:43:34.281930 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 
08 22:43:34 crc kubenswrapper[4834]: I1008 22:43:34.991773 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 08 22:43:34 crc kubenswrapper[4834]: I1008 22:43:34.993101 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 22:43:34 crc kubenswrapper[4834]: I1008 22:43:34.995632 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 08 22:43:34 crc kubenswrapper[4834]: I1008 22:43:34.996056 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wjqz9" Oct 08 22:43:34 crc kubenswrapper[4834]: I1008 22:43:34.996798 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.013315 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.078131 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.078503 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.078687 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n9ln\" (UniqueName: 
\"kubernetes.io/projected/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-kube-api-access-4n9ln\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.078825 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.180194 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9ln\" (UniqueName: \"kubernetes.io/projected/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-kube-api-access-4n9ln\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.180256 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.180335 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.180399 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config\") pod \"openstackclient\" 
(UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.181466 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.187856 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config-secret\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.189945 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.196788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9ln\" (UniqueName: \"kubernetes.io/projected/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-kube-api-access-4n9ln\") pod \"openstackclient\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " pod="openstack/openstackclient" Oct 08 22:43:35 crc kubenswrapper[4834]: I1008 22:43:35.318585 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 22:43:44 crc kubenswrapper[4834]: E1008 22:43:44.824992 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 08 22:43:44 crc kubenswrapper[4834]: E1008 22:43:44.825781 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:co
mbined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kckgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-449pd_openstack(15a24e03-f3be-433f-bbc1-3a25da713c65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:43:44 crc kubenswrapper[4834]: E1008 22:43:44.827430 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-449pd" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" Oct 08 22:43:44 crc kubenswrapper[4834]: E1008 22:43:44.874616 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Oct 08 22:43:44 crc kubenswrapper[4834]: E1008 22:43:44.874832 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8jhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1a479306-f4e1-49a0-9e1d-4ba54ecedf90): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 08 22:43:44 crc kubenswrapper[4834]: E1008 22:43:44.876818 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"sg-core\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" Oct 08 22:43:44 crc kubenswrapper[4834]: I1008 22:43:44.939209 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" containerName="ceilometer-notification-agent" 
containerID="cri-o://9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409" gracePeriod=30 Oct 08 22:43:45 crc kubenswrapper[4834]: I1008 22:43:45.319405 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 08 22:43:45 crc kubenswrapper[4834]: W1008 22:43:45.322998 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b7117c_11cc_4ba9_bd98_e25e6a56d8a6.slice/crio-085daee131aa45d88d4543a49fc5f8c84a78115eb6f7cdfda6180469ec2ec730 WatchSource:0}: Error finding container 085daee131aa45d88d4543a49fc5f8c84a78115eb6f7cdfda6180469ec2ec730: Status 404 returned error can't find the container with id 085daee131aa45d88d4543a49fc5f8c84a78115eb6f7cdfda6180469ec2ec730 Oct 08 22:43:45 crc kubenswrapper[4834]: I1008 22:43:45.948672 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6","Type":"ContainerStarted","Data":"085daee131aa45d88d4543a49fc5f8c84a78115eb6f7cdfda6180469ec2ec730"} Oct 08 22:43:45 crc kubenswrapper[4834]: I1008 22:43:45.950583 4834 generic.go:334] "Generic (PLEG): container finished" podID="53c0210a-93d4-4f54-a542-d69c77229b9e" containerID="43fb96c167efdaaa1299cb06ffba0aaaeaff5f3e9c0bac460dec7ea52e491fdd" exitCode=0 Oct 08 22:43:45 crc kubenswrapper[4834]: I1008 22:43:45.950637 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhj54" event={"ID":"53c0210a-93d4-4f54-a542-d69c77229b9e","Type":"ContainerDied","Data":"43fb96c167efdaaa1299cb06ffba0aaaeaff5f3e9c0bac460dec7ea52e491fdd"} Oct 08 22:43:45 crc kubenswrapper[4834]: I1008 22:43:45.953498 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g2wdt" event={"ID":"3d656893-3446-42fe-86ad-74e1b9d7ecd5","Type":"ContainerStarted","Data":"fa460528d511bb58346a4686bdd10ae400531ea0b52fc947a827453536728a0c"} Oct 08 22:43:45 crc 
kubenswrapper[4834]: I1008 22:43:45.986241 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-g2wdt" podStartSLOduration=2.629207543 podStartE2EDuration="1m33.986222939s" podCreationTimestamp="2025-10-08 22:42:12 +0000 UTC" firstStartedPulling="2025-10-08 22:42:13.669782951 +0000 UTC m=+1141.492667697" lastFinishedPulling="2025-10-08 22:43:45.026798327 +0000 UTC m=+1232.849683093" observedRunningTime="2025-10-08 22:43:45.985454851 +0000 UTC m=+1233.808339607" watchObservedRunningTime="2025-10-08 22:43:45.986222939 +0000 UTC m=+1233.809107705" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.320996 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mhj54" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.415424 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-config-data\") pod \"53c0210a-93d4-4f54-a542-d69c77229b9e\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.415489 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhb9x\" (UniqueName: \"kubernetes.io/projected/53c0210a-93d4-4f54-a542-d69c77229b9e-kube-api-access-zhb9x\") pod \"53c0210a-93d4-4f54-a542-d69c77229b9e\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.415522 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53c0210a-93d4-4f54-a542-d69c77229b9e-logs\") pod \"53c0210a-93d4-4f54-a542-d69c77229b9e\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.415555 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-combined-ca-bundle\") pod \"53c0210a-93d4-4f54-a542-d69c77229b9e\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.415632 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-scripts\") pod \"53c0210a-93d4-4f54-a542-d69c77229b9e\" (UID: \"53c0210a-93d4-4f54-a542-d69c77229b9e\") " Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.415938 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53c0210a-93d4-4f54-a542-d69c77229b9e-logs" (OuterVolumeSpecName: "logs") pod "53c0210a-93d4-4f54-a542-d69c77229b9e" (UID: "53c0210a-93d4-4f54-a542-d69c77229b9e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.424314 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c0210a-93d4-4f54-a542-d69c77229b9e-kube-api-access-zhb9x" (OuterVolumeSpecName: "kube-api-access-zhb9x") pod "53c0210a-93d4-4f54-a542-d69c77229b9e" (UID: "53c0210a-93d4-4f54-a542-d69c77229b9e"). InnerVolumeSpecName "kube-api-access-zhb9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.428313 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-scripts" (OuterVolumeSpecName: "scripts") pod "53c0210a-93d4-4f54-a542-d69c77229b9e" (UID: "53c0210a-93d4-4f54-a542-d69c77229b9e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.449491 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-config-data" (OuterVolumeSpecName: "config-data") pod "53c0210a-93d4-4f54-a542-d69c77229b9e" (UID: "53c0210a-93d4-4f54-a542-d69c77229b9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.467359 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53c0210a-93d4-4f54-a542-d69c77229b9e" (UID: "53c0210a-93d4-4f54-a542-d69c77229b9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.517268 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhb9x\" (UniqueName: \"kubernetes.io/projected/53c0210a-93d4-4f54-a542-d69c77229b9e-kube-api-access-zhb9x\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.517305 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53c0210a-93d4-4f54-a542-d69c77229b9e-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.517318 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.517329 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:47 
crc kubenswrapper[4834]: I1008 22:43:47.517340 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c0210a-93d4-4f54-a542-d69c77229b9e-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.973541 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mhj54" event={"ID":"53c0210a-93d4-4f54-a542-d69c77229b9e","Type":"ContainerDied","Data":"bd0c7e56f0ae893606dcf3e6ae1ea8d2a655d462af51b53dc46ef4defb99b418"} Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.973589 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd0c7e56f0ae893606dcf3e6ae1ea8d2a655d462af51b53dc46ef4defb99b418" Oct 08 22:43:47 crc kubenswrapper[4834]: I1008 22:43:47.973642 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mhj54" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.063290 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55b44744c4-z2p4d"] Oct 08 22:43:48 crc kubenswrapper[4834]: E1008 22:43:48.064101 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c0210a-93d4-4f54-a542-d69c77229b9e" containerName="placement-db-sync" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.064122 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c0210a-93d4-4f54-a542-d69c77229b9e" containerName="placement-db-sync" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.064396 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c0210a-93d4-4f54-a542-d69c77229b9e" containerName="placement-db-sync" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.065453 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.080708 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55b44744c4-z2p4d"] Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.081639 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.081854 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.081971 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.082093 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.082542 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rgzrw" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128526 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5d01ab-b923-4829-9b10-6ad9010216eb-logs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128582 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-internal-tls-certs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128633 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-public-tls-certs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128657 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmrv2\" (UniqueName: \"kubernetes.io/projected/6c5d01ab-b923-4829-9b10-6ad9010216eb-kube-api-access-tmrv2\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128680 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-config-data\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128728 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-scripts\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.128754 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-combined-ca-bundle\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230182 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5d01ab-b923-4829-9b10-6ad9010216eb-logs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230230 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-internal-tls-certs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230276 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-public-tls-certs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230297 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmrv2\" (UniqueName: \"kubernetes.io/projected/6c5d01ab-b923-4829-9b10-6ad9010216eb-kube-api-access-tmrv2\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230315 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-config-data\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230348 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-scripts\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.230366 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-combined-ca-bundle\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.231383 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5d01ab-b923-4829-9b10-6ad9010216eb-logs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.238544 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-config-data\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.240500 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-scripts\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.245708 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-combined-ca-bundle\") pod \"placement-55b44744c4-z2p4d\" (UID: 
\"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.248737 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-public-tls-certs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.249331 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-internal-tls-certs\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.256800 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmrv2\" (UniqueName: \"kubernetes.io/projected/6c5d01ab-b923-4829-9b10-6ad9010216eb-kube-api-access-tmrv2\") pod \"placement-55b44744c4-z2p4d\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.415479 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.870254 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55b44744c4-z2p4d"] Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.986064 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.987199 4834 generic.go:334] "Generic (PLEG): container finished" podID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" containerID="9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409" exitCode=0 Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.987271 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a479306-f4e1-49a0-9e1d-4ba54ecedf90","Type":"ContainerDied","Data":"9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409"} Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.987306 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a479306-f4e1-49a0-9e1d-4ba54ecedf90","Type":"ContainerDied","Data":"30f6c464fee03a17a900485019a5cc51474853e708ea0887c1fe8f6c31a22c76"} Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.987323 4834 scope.go:117] "RemoveContainer" containerID="9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409" Oct 08 22:43:48 crc kubenswrapper[4834]: I1008 22:43:48.994321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b44744c4-z2p4d" event={"ID":"6c5d01ab-b923-4829-9b10-6ad9010216eb","Type":"ContainerStarted","Data":"251cc17d71ffcffa0fb855a97386069ce927a2207f2187b4c6f87376e8ed259c"} Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.020688 4834 scope.go:117] "RemoveContainer" containerID="9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409" Oct 08 22:43:49 crc kubenswrapper[4834]: E1008 22:43:49.021562 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409\": container with ID starting with 9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409 not found: ID does not exist" 
containerID="9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.021596 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409"} err="failed to get container status \"9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409\": rpc error: code = NotFound desc = could not find container \"9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409\": container with ID starting with 9a9e9002a8f5e88c573acb171e6833d4bc4a8f7a89c1f47022ef5b011f236409 not found: ID does not exist" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045090 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-sg-core-conf-yaml\") pod \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045146 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-scripts\") pod \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045202 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-log-httpd\") pod \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045252 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-combined-ca-bundle\") pod 
\"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045274 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8jhm\" (UniqueName: \"kubernetes.io/projected/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-kube-api-access-r8jhm\") pod \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045303 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-run-httpd\") pod \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.045333 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-config-data\") pod \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\" (UID: \"1a479306-f4e1-49a0-9e1d-4ba54ecedf90\") " Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.046070 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.047003 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.049369 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-scripts" (OuterVolumeSpecName: "scripts") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.050592 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-kube-api-access-r8jhm" (OuterVolumeSpecName: "kube-api-access-r8jhm") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "kube-api-access-r8jhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.051062 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.076465 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-config-data" (OuterVolumeSpecName: "config-data") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.082614 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a479306-f4e1-49a0-9e1d-4ba54ecedf90" (UID: "1a479306-f4e1-49a0-9e1d-4ba54ecedf90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147283 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147318 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147332 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147343 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147355 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8jhm\" (UniqueName: \"kubernetes.io/projected/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-kube-api-access-r8jhm\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147368 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.147378 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a479306-f4e1-49a0-9e1d-4ba54ecedf90-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.414320 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-69f6cbfd5c-82mhn"] Oct 08 22:43:49 crc kubenswrapper[4834]: E1008 22:43:49.414722 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" containerName="ceilometer-notification-agent" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.414744 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" containerName="ceilometer-notification-agent" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.414965 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" containerName="ceilometer-notification-agent" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.416072 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.418691 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.418893 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.419151 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.432855 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69f6cbfd5c-82mhn"] Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.556598 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-internal-tls-certs\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.556968 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-public-tls-certs\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.556994 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-combined-ca-bundle\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc 
kubenswrapper[4834]: I1008 22:43:49.557039 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-log-httpd\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.557107 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64xb\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-kube-api-access-z64xb\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.557129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-run-httpd\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.557178 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-config-data\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.557348 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-etc-swift\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 
08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659319 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-internal-tls-certs\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659379 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-public-tls-certs\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659404 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-combined-ca-bundle\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659419 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-log-httpd\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659460 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64xb\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-kube-api-access-z64xb\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: 
I1008 22:43:49.659476 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-run-httpd\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659496 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-config-data\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659554 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-etc-swift\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.659984 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-log-httpd\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.660627 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-run-httpd\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.665408 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-public-tls-certs\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.665892 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-internal-tls-certs\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.666863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-config-data\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.680224 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-combined-ca-bundle\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.680500 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-etc-swift\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.687290 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64xb\" (UniqueName: 
\"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-kube-api-access-z64xb\") pod \"swift-proxy-69f6cbfd5c-82mhn\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:49 crc kubenswrapper[4834]: I1008 22:43:49.734116 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.004482 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.015386 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b44744c4-z2p4d" event={"ID":"6c5d01ab-b923-4829-9b10-6ad9010216eb","Type":"ContainerStarted","Data":"1057d7b178b738e910bc1a7a9841be19ad4ff8504d2eb4e2c8e8ff4bf273e1f0"} Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.015431 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b44744c4-z2p4d" event={"ID":"6c5d01ab-b923-4829-9b10-6ad9010216eb","Type":"ContainerStarted","Data":"8572778b6d4762619545e6de6bc9dc967110b451eb1a7a8a406e0ababce1ccb3"} Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.016267 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.016286 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.070239 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.086256 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.098559 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 
08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.101999 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55b44744c4-z2p4d" podStartSLOduration=2.101975522 podStartE2EDuration="2.101975522s" podCreationTimestamp="2025-10-08 22:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:43:50.089208652 +0000 UTC m=+1237.912093408" watchObservedRunningTime="2025-10-08 22:43:50.101975522 +0000 UTC m=+1237.924860268" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.103200 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.107246 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.107345 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.139518 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.170354 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.170419 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-run-httpd\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: 
I1008 22:43:50.170460 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-scripts\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.170540 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-log-httpd\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.170571 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-config-data\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.170639 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgxzn\" (UniqueName: \"kubernetes.io/projected/389aef32-a13f-4210-889b-5f2eb59fae52-kube-api-access-jgxzn\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.170775 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.272834 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-log-httpd\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.272893 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-config-data\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.272972 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgxzn\" (UniqueName: \"kubernetes.io/projected/389aef32-a13f-4210-889b-5f2eb59fae52-kube-api-access-jgxzn\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.273080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.273114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.273163 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-run-httpd\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc 
kubenswrapper[4834]: I1008 22:43:50.273210 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-scripts\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.274009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-log-httpd\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.274027 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-run-httpd\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.286807 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.287428 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-scripts\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.287658 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-config-data\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " 
pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.288317 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.294415 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgxzn\" (UniqueName: \"kubernetes.io/projected/389aef32-a13f-4210-889b-5f2eb59fae52-kube-api-access-jgxzn\") pod \"ceilometer-0\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.431920 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.507866 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.534718 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-69f6cbfd5c-82mhn"] Oct 08 22:43:50 crc kubenswrapper[4834]: I1008 22:43:50.718899 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.026136 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" event={"ID":"c5aa1aef-afe2-4b70-9033-c62921f3d106","Type":"ContainerStarted","Data":"8178bb5c63d59751fa04e9a8611028105cb6a3f042c30ae51e01c44207c48306"} Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.026918 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" 
event={"ID":"c5aa1aef-afe2-4b70-9033-c62921f3d106","Type":"ContainerStarted","Data":"a42c2b464216c2c57060cfdd7711c877ae77f5b4d9f363897d48919f5a6b5ef8"} Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.029286 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerStarted","Data":"4fda5c44add7e2a9493025b1b4228509337dd0e13490570874aae1b21fe93f26"} Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.552315 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hkfnr"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.554232 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.571348 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a479306-f4e1-49a0-9e1d-4ba54ecedf90" path="/var/lib/kubelet/pods/1a479306-f4e1-49a0-9e1d-4ba54ecedf90/volumes" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.572251 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hkfnr"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.622407 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjr89\" (UniqueName: \"kubernetes.io/projected/4adf21b4-b07b-41df-b887-29580e96f8b9-kube-api-access-jjr89\") pod \"nova-api-db-create-hkfnr\" (UID: \"4adf21b4-b07b-41df-b887-29580e96f8b9\") " pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.656706 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rgzpk"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.658286 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.667782 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rgzpk"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.723527 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjr89\" (UniqueName: \"kubernetes.io/projected/4adf21b4-b07b-41df-b887-29580e96f8b9-kube-api-access-jjr89\") pod \"nova-api-db-create-hkfnr\" (UID: \"4adf21b4-b07b-41df-b887-29580e96f8b9\") " pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.723566 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggqc4\" (UniqueName: \"kubernetes.io/projected/9d1ee346-0de8-44e6-a240-05669af5a41e-kube-api-access-ggqc4\") pod \"nova-cell0-db-create-rgzpk\" (UID: \"9d1ee346-0de8-44e6-a240-05669af5a41e\") " pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.754786 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hk2mh"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.755997 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.770099 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hk2mh"] Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.825008 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzjp\" (UniqueName: \"kubernetes.io/projected/c87464d1-1c42-43d1-b273-e25acf2895cd-kube-api-access-zxzjp\") pod \"nova-cell1-db-create-hk2mh\" (UID: \"c87464d1-1c42-43d1-b273-e25acf2895cd\") " pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.825137 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggqc4\" (UniqueName: \"kubernetes.io/projected/9d1ee346-0de8-44e6-a240-05669af5a41e-kube-api-access-ggqc4\") pod \"nova-cell0-db-create-rgzpk\" (UID: \"9d1ee346-0de8-44e6-a240-05669af5a41e\") " pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.825666 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjr89\" (UniqueName: \"kubernetes.io/projected/4adf21b4-b07b-41df-b887-29580e96f8b9-kube-api-access-jjr89\") pod \"nova-api-db-create-hkfnr\" (UID: \"4adf21b4-b07b-41df-b887-29580e96f8b9\") " pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.849854 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggqc4\" (UniqueName: \"kubernetes.io/projected/9d1ee346-0de8-44e6-a240-05669af5a41e-kube-api-access-ggqc4\") pod \"nova-cell0-db-create-rgzpk\" (UID: \"9d1ee346-0de8-44e6-a240-05669af5a41e\") " pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.875026 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.927019 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzjp\" (UniqueName: \"kubernetes.io/projected/c87464d1-1c42-43d1-b273-e25acf2895cd-kube-api-access-zxzjp\") pod \"nova-cell1-db-create-hk2mh\" (UID: \"c87464d1-1c42-43d1-b273-e25acf2895cd\") " pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.952310 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzjp\" (UniqueName: \"kubernetes.io/projected/c87464d1-1c42-43d1-b273-e25acf2895cd-kube-api-access-zxzjp\") pod \"nova-cell1-db-create-hk2mh\" (UID: \"c87464d1-1c42-43d1-b273-e25acf2895cd\") " pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:43:51 crc kubenswrapper[4834]: I1008 22:43:51.990146 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:43:52 crc kubenswrapper[4834]: I1008 22:43:52.046301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" event={"ID":"c5aa1aef-afe2-4b70-9033-c62921f3d106","Type":"ContainerStarted","Data":"62ac2f468ed7a8ca1ccfd138476149a4798728df3269c8a1e691fb31a153f440"} Oct 08 22:43:52 crc kubenswrapper[4834]: I1008 22:43:52.047966 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:52 crc kubenswrapper[4834]: I1008 22:43:52.048012 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:52 crc kubenswrapper[4834]: I1008 22:43:52.073603 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" podStartSLOduration=3.073587175 podStartE2EDuration="3.073587175s" podCreationTimestamp="2025-10-08 22:43:49 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:43:52.071201047 +0000 UTC m=+1239.894085803" watchObservedRunningTime="2025-10-08 22:43:52.073587175 +0000 UTC m=+1239.896471921" Oct 08 22:43:52 crc kubenswrapper[4834]: I1008 22:43:52.118579 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:43:57 crc kubenswrapper[4834]: E1008 22:43:57.563672 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-449pd" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" Oct 08 22:43:59 crc kubenswrapper[4834]: I1008 22:43:59.742094 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:43:59 crc kubenswrapper[4834]: I1008 22:43:59.748483 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:44:05 crc kubenswrapper[4834]: E1008 22:44:05.650441 4834 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be" Oct 08 22:44:05 crc kubenswrapper[4834]: E1008 22:44:05.651072 4834 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n54h98h557hfch56fh67bh6bh74h684h76h578h5cbh547h566h666h654h545h644h697h559h5cfh559h5d4h5fbh56fh5h68ch64h5fbh97h598h677q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4n9ln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 08 22:44:05 crc kubenswrapper[4834]: E1008 22:44:05.652522 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.021500 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hkfnr"] Oct 08 22:44:06 crc kubenswrapper[4834]: W1008 22:44:06.024896 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4adf21b4_b07b_41df_b887_29580e96f8b9.slice/crio-daf635e30c1d80b7d4f4b9e3fb6385aa2d796e785bb80ec3cdb2cc00cd1f977c WatchSource:0}: Error finding container daf635e30c1d80b7d4f4b9e3fb6385aa2d796e785bb80ec3cdb2cc00cd1f977c: Status 404 returned error can't find the container with id daf635e30c1d80b7d4f4b9e3fb6385aa2d796e785bb80ec3cdb2cc00cd1f977c Oct 08 22:44:06 crc kubenswrapper[4834]: W1008 22:44:06.141975 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc87464d1_1c42_43d1_b273_e25acf2895cd.slice/crio-f2769f05b489ecae4c9bc99497ce5b48737efa2dad89d5fd38bf4be7a0020c73 WatchSource:0}: 
Error finding container f2769f05b489ecae4c9bc99497ce5b48737efa2dad89d5fd38bf4be7a0020c73: Status 404 returned error can't find the container with id f2769f05b489ecae4c9bc99497ce5b48737efa2dad89d5fd38bf4be7a0020c73 Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.151061 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hk2mh"] Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.158703 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rgzpk"] Oct 08 22:44:06 crc kubenswrapper[4834]: W1008 22:44:06.166567 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d1ee346_0de8_44e6_a240_05669af5a41e.slice/crio-ac667bff86d07a35ad1e6262d6f31a437ab06aa0716ae36dcd7fa5d18e86fb84 WatchSource:0}: Error finding container ac667bff86d07a35ad1e6262d6f31a437ab06aa0716ae36dcd7fa5d18e86fb84: Status 404 returned error can't find the container with id ac667bff86d07a35ad1e6262d6f31a437ab06aa0716ae36dcd7fa5d18e86fb84 Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.196590 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerStarted","Data":"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db"} Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.198375 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rgzpk" event={"ID":"9d1ee346-0de8-44e6-a240-05669af5a41e","Type":"ContainerStarted","Data":"ac667bff86d07a35ad1e6262d6f31a437ab06aa0716ae36dcd7fa5d18e86fb84"} Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.201949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkfnr" 
event={"ID":"4adf21b4-b07b-41df-b887-29580e96f8b9","Type":"ContainerStarted","Data":"4aa0ba726ca1aa64b7eeea2ec8352bfcf0ba2a169332fa4c550d65f578ecb4f3"} Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.201991 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkfnr" event={"ID":"4adf21b4-b07b-41df-b887-29580e96f8b9","Type":"ContainerStarted","Data":"daf635e30c1d80b7d4f4b9e3fb6385aa2d796e785bb80ec3cdb2cc00cd1f977c"} Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.204280 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hk2mh" event={"ID":"c87464d1-1c42-43d1-b273-e25acf2895cd","Type":"ContainerStarted","Data":"f2769f05b489ecae4c9bc99497ce5b48737efa2dad89d5fd38bf4be7a0020c73"} Oct 08 22:44:06 crc kubenswrapper[4834]: E1008 22:44:06.213134 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be\\\"\"" pod="openstack/openstackclient" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" Oct 08 22:44:06 crc kubenswrapper[4834]: I1008 22:44:06.219744 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-hkfnr" podStartSLOduration=15.219727075 podStartE2EDuration="15.219727075s" podCreationTimestamp="2025-10-08 22:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:06.217289966 +0000 UTC m=+1254.040174712" watchObservedRunningTime="2025-10-08 22:44:06.219727075 +0000 UTC m=+1254.042611821" Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.215204 4834 generic.go:334] "Generic (PLEG): container finished" podID="c87464d1-1c42-43d1-b273-e25acf2895cd" 
containerID="99327c7137c459ab6c1a2243cfe4b0f0f55a62c5f12e27cb1eae90bc24efe0be" exitCode=0 Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.215690 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hk2mh" event={"ID":"c87464d1-1c42-43d1-b273-e25acf2895cd","Type":"ContainerDied","Data":"99327c7137c459ab6c1a2243cfe4b0f0f55a62c5f12e27cb1eae90bc24efe0be"} Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.218061 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerStarted","Data":"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f"} Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.219727 4834 generic.go:334] "Generic (PLEG): container finished" podID="9d1ee346-0de8-44e6-a240-05669af5a41e" containerID="00a28d17edcc7b5ed388c8fc1b886f0caf34114f3bd6fb165a2bcc569450f50d" exitCode=0 Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.219773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rgzpk" event={"ID":"9d1ee346-0de8-44e6-a240-05669af5a41e","Type":"ContainerDied","Data":"00a28d17edcc7b5ed388c8fc1b886f0caf34114f3bd6fb165a2bcc569450f50d"} Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.222406 4834 generic.go:334] "Generic (PLEG): container finished" podID="4adf21b4-b07b-41df-b887-29580e96f8b9" containerID="4aa0ba726ca1aa64b7eeea2ec8352bfcf0ba2a169332fa4c550d65f578ecb4f3" exitCode=0 Oct 08 22:44:07 crc kubenswrapper[4834]: I1008 22:44:07.222458 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkfnr" event={"ID":"4adf21b4-b07b-41df-b887-29580e96f8b9","Type":"ContainerDied","Data":"4aa0ba726ca1aa64b7eeea2ec8352bfcf0ba2a169332fa4c550d65f578ecb4f3"} Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.233574 4834 generic.go:334] "Generic (PLEG): container finished" podID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" 
containerID="fa460528d511bb58346a4686bdd10ae400531ea0b52fc947a827453536728a0c" exitCode=0 Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.233999 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g2wdt" event={"ID":"3d656893-3446-42fe-86ad-74e1b9d7ecd5","Type":"ContainerDied","Data":"fa460528d511bb58346a4686bdd10ae400531ea0b52fc947a827453536728a0c"} Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.238058 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerStarted","Data":"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5"} Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.747781 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.752490 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.758763 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.852246 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxzjp\" (UniqueName: \"kubernetes.io/projected/c87464d1-1c42-43d1-b273-e25acf2895cd-kube-api-access-zxzjp\") pod \"c87464d1-1c42-43d1-b273-e25acf2895cd\" (UID: \"c87464d1-1c42-43d1-b273-e25acf2895cd\") " Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.852301 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggqc4\" (UniqueName: \"kubernetes.io/projected/9d1ee346-0de8-44e6-a240-05669af5a41e-kube-api-access-ggqc4\") pod \"9d1ee346-0de8-44e6-a240-05669af5a41e\" (UID: \"9d1ee346-0de8-44e6-a240-05669af5a41e\") " Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.852465 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjr89\" (UniqueName: \"kubernetes.io/projected/4adf21b4-b07b-41df-b887-29580e96f8b9-kube-api-access-jjr89\") pod \"4adf21b4-b07b-41df-b887-29580e96f8b9\" (UID: \"4adf21b4-b07b-41df-b887-29580e96f8b9\") " Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.859704 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adf21b4-b07b-41df-b887-29580e96f8b9-kube-api-access-jjr89" (OuterVolumeSpecName: "kube-api-access-jjr89") pod "4adf21b4-b07b-41df-b887-29580e96f8b9" (UID: "4adf21b4-b07b-41df-b887-29580e96f8b9"). InnerVolumeSpecName "kube-api-access-jjr89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.870135 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87464d1-1c42-43d1-b273-e25acf2895cd-kube-api-access-zxzjp" (OuterVolumeSpecName: "kube-api-access-zxzjp") pod "c87464d1-1c42-43d1-b273-e25acf2895cd" (UID: "c87464d1-1c42-43d1-b273-e25acf2895cd"). 
InnerVolumeSpecName "kube-api-access-zxzjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.870598 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1ee346-0de8-44e6-a240-05669af5a41e-kube-api-access-ggqc4" (OuterVolumeSpecName: "kube-api-access-ggqc4") pod "9d1ee346-0de8-44e6-a240-05669af5a41e" (UID: "9d1ee346-0de8-44e6-a240-05669af5a41e"). InnerVolumeSpecName "kube-api-access-ggqc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.954182 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjr89\" (UniqueName: \"kubernetes.io/projected/4adf21b4-b07b-41df-b887-29580e96f8b9-kube-api-access-jjr89\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.954254 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxzjp\" (UniqueName: \"kubernetes.io/projected/c87464d1-1c42-43d1-b273-e25acf2895cd-kube-api-access-zxzjp\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:08 crc kubenswrapper[4834]: I1008 22:44:08.954327 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggqc4\" (UniqueName: \"kubernetes.io/projected/9d1ee346-0de8-44e6-a240-05669af5a41e-kube-api-access-ggqc4\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.260485 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rgzpk" event={"ID":"9d1ee346-0de8-44e6-a240-05669af5a41e","Type":"ContainerDied","Data":"ac667bff86d07a35ad1e6262d6f31a437ab06aa0716ae36dcd7fa5d18e86fb84"} Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.260835 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac667bff86d07a35ad1e6262d6f31a437ab06aa0716ae36dcd7fa5d18e86fb84" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 
22:44:09.260902 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rgzpk" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.271999 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hkfnr" event={"ID":"4adf21b4-b07b-41df-b887-29580e96f8b9","Type":"ContainerDied","Data":"daf635e30c1d80b7d4f4b9e3fb6385aa2d796e785bb80ec3cdb2cc00cd1f977c"} Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.272027 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hkfnr" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.272045 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="daf635e30c1d80b7d4f4b9e3fb6385aa2d796e785bb80ec3cdb2cc00cd1f977c" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.279972 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hk2mh" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.280202 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hk2mh" event={"ID":"c87464d1-1c42-43d1-b273-e25acf2895cd","Type":"ContainerDied","Data":"f2769f05b489ecae4c9bc99497ce5b48737efa2dad89d5fd38bf4be7a0020c73"} Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.280235 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2769f05b489ecae4c9bc99497ce5b48737efa2dad89d5fd38bf4be7a0020c73" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.838112 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.869814 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-combined-ca-bundle\") pod \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.869869 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-db-sync-config-data\") pod \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.869985 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b87xv\" (UniqueName: \"kubernetes.io/projected/3d656893-3446-42fe-86ad-74e1b9d7ecd5-kube-api-access-b87xv\") pod \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\" (UID: \"3d656893-3446-42fe-86ad-74e1b9d7ecd5\") " Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.875977 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3d656893-3446-42fe-86ad-74e1b9d7ecd5" (UID: "3d656893-3446-42fe-86ad-74e1b9d7ecd5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.878390 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d656893-3446-42fe-86ad-74e1b9d7ecd5-kube-api-access-b87xv" (OuterVolumeSpecName: "kube-api-access-b87xv") pod "3d656893-3446-42fe-86ad-74e1b9d7ecd5" (UID: "3d656893-3446-42fe-86ad-74e1b9d7ecd5"). 
InnerVolumeSpecName "kube-api-access-b87xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.900218 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d656893-3446-42fe-86ad-74e1b9d7ecd5" (UID: "3d656893-3446-42fe-86ad-74e1b9d7ecd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.972364 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.972571 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d656893-3446-42fe-86ad-74e1b9d7ecd5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:09 crc kubenswrapper[4834]: I1008 22:44:09.972660 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b87xv\" (UniqueName: \"kubernetes.io/projected/3d656893-3446-42fe-86ad-74e1b9d7ecd5-kube-api-access-b87xv\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.299088 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerStarted","Data":"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9"} Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.299265 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-central-agent" 
containerID="cri-o://0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" gracePeriod=30 Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.299304 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.299410 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="proxy-httpd" containerID="cri-o://6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" gracePeriod=30 Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.299486 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="sg-core" containerID="cri-o://1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" gracePeriod=30 Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.299555 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-notification-agent" containerID="cri-o://84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" gracePeriod=30 Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.305130 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g2wdt" event={"ID":"3d656893-3446-42fe-86ad-74e1b9d7ecd5","Type":"ContainerDied","Data":"3bb8c8a2b1d0e2f962a924181fb3fd0e255c9d81e6f82a1143ede0f82de012b9"} Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.305188 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bb8c8a2b1d0e2f962a924181fb3fd0e255c9d81e6f82a1143ede0f82de012b9" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.305262 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g2wdt" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.498820 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.182722152 podStartE2EDuration="20.498801339s" podCreationTimestamp="2025-10-08 22:43:50 +0000 UTC" firstStartedPulling="2025-10-08 22:43:50.778330177 +0000 UTC m=+1238.601214923" lastFinishedPulling="2025-10-08 22:44:09.094409364 +0000 UTC m=+1256.917294110" observedRunningTime="2025-10-08 22:44:10.334439551 +0000 UTC m=+1258.157324307" watchObservedRunningTime="2025-10-08 22:44:10.498801339 +0000 UTC m=+1258.321686085" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505037 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-688bb4b854-srcv6"] Oct 08 22:44:10 crc kubenswrapper[4834]: E1008 22:44:10.505441 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87464d1-1c42-43d1-b273-e25acf2895cd" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505465 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87464d1-1c42-43d1-b273-e25acf2895cd" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: E1008 22:44:10.505479 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" containerName="barbican-db-sync" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505488 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" containerName="barbican-db-sync" Oct 08 22:44:10 crc kubenswrapper[4834]: E1008 22:44:10.505500 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1ee346-0de8-44e6-a240-05669af5a41e" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505508 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9d1ee346-0de8-44e6-a240-05669af5a41e" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: E1008 22:44:10.505524 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adf21b4-b07b-41df-b887-29580e96f8b9" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505531 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adf21b4-b07b-41df-b887-29580e96f8b9" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505703 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" containerName="barbican-db-sync" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505721 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adf21b4-b07b-41df-b887-29580e96f8b9" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505732 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87464d1-1c42-43d1-b273-e25acf2895cd" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.505753 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1ee346-0de8-44e6-a240-05669af5a41e" containerName="mariadb-database-create" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.506765 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.509928 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.511867 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.512124 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zsfx5" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.532055 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-688bb4b854-srcv6"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.562843 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7779b9cfc5-lq477"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.564228 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.567661 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582714 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data-custom\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582759 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-combined-ca-bundle\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582787 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6122ff69-d6fb-4002-8679-80b826faf58f-logs\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582831 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7d4f35-145c-4af9-9f4b-de8700877370-logs\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582894 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data-custom\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582929 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-combined-ca-bundle\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.582961 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2f6g\" (UniqueName: \"kubernetes.io/projected/2f7d4f35-145c-4af9-9f4b-de8700877370-kube-api-access-p2f6g\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.583013 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.583043 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkzjq\" (UniqueName: \"kubernetes.io/projected/6122ff69-d6fb-4002-8679-80b826faf58f-kube-api-access-mkzjq\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: 
\"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.583094 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.616682 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7779b9cfc5-lq477"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.651343 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54df4b685c-fp7gd"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.654683 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.666399 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54df4b685c-fp7gd"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684675 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684737 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzjq\" (UniqueName: \"kubernetes.io/projected/6122ff69-d6fb-4002-8679-80b826faf58f-kube-api-access-mkzjq\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " 
pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684795 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684835 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-nb\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684879 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data-custom\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684900 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6122ff69-d6fb-4002-8679-80b826faf58f-logs\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684919 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-combined-ca-bundle\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: 
\"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684955 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-config\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.684988 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7d4f35-145c-4af9-9f4b-de8700877370-logs\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685043 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-svc\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685061 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-swift-storage-0\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685078 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn44d\" (UniqueName: \"kubernetes.io/projected/f24a9f56-4cc7-40e7-abde-51cd4adf512e-kube-api-access-dn44d\") pod 
\"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685113 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data-custom\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685158 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-combined-ca-bundle\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685186 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2f6g\" (UniqueName: \"kubernetes.io/projected/2f7d4f35-145c-4af9-9f4b-de8700877370-kube-api-access-p2f6g\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.685207 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-sb\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.687340 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2f7d4f35-145c-4af9-9f4b-de8700877370-logs\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.688379 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6122ff69-d6fb-4002-8679-80b826faf58f-logs\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.691515 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data-custom\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.692321 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-combined-ca-bundle\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.692886 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-combined-ca-bundle\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.694908 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.696605 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.705561 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data-custom\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.706881 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzjq\" (UniqueName: \"kubernetes.io/projected/6122ff69-d6fb-4002-8679-80b826faf58f-kube-api-access-mkzjq\") pod \"barbican-keystone-listener-688bb4b854-srcv6\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.716055 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2f6g\" (UniqueName: \"kubernetes.io/projected/2f7d4f35-145c-4af9-9f4b-de8700877370-kube-api-access-p2f6g\") pod \"barbican-worker-7779b9cfc5-lq477\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.743661 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-api-787c99cdfb-4zdjc"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.750617 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.757777 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.796898 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-787c99cdfb-4zdjc"] Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.802683 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-nb\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.802971 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-combined-ca-bundle\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-config\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803396 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3553ff9f-9a8d-40bc-919a-f6a400f001f6-logs\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803471 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9bnp\" (UniqueName: \"kubernetes.io/projected/3553ff9f-9a8d-40bc-919a-f6a400f001f6-kube-api-access-c9bnp\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803554 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data-custom\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803678 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-svc\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803714 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-swift-storage-0\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn44d\" 
(UniqueName: \"kubernetes.io/projected/f24a9f56-4cc7-40e7-abde-51cd4adf512e-kube-api-access-dn44d\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-sb\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.803957 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.804180 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-config\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.805580 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-swift-storage-0\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.806203 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-sb\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.806890 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-svc\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.814411 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-nb\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.819313 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn44d\" (UniqueName: \"kubernetes.io/projected/f24a9f56-4cc7-40e7-abde-51cd4adf512e-kube-api-access-dn44d\") pod \"dnsmasq-dns-54df4b685c-fp7gd\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.838997 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.905874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-combined-ca-bundle\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.905974 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3553ff9f-9a8d-40bc-919a-f6a400f001f6-logs\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.906004 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9bnp\" (UniqueName: \"kubernetes.io/projected/3553ff9f-9a8d-40bc-919a-f6a400f001f6-kube-api-access-c9bnp\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.906047 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data-custom\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.906227 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " 
pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.906721 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3553ff9f-9a8d-40bc-919a-f6a400f001f6-logs\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.910908 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-combined-ca-bundle\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.913094 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data-custom\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.917198 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.920668 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9bnp\" (UniqueName: \"kubernetes.io/projected/3553ff9f-9a8d-40bc-919a-f6a400f001f6-kube-api-access-c9bnp\") pod \"barbican-api-787c99cdfb-4zdjc\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") " pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:10 crc kubenswrapper[4834]: 
I1008 22:44:10.921201 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:44:10 crc kubenswrapper[4834]: I1008 22:44:10.972185 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.099458 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.303704 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.329634 4834 generic.go:334] "Generic (PLEG): container finished" podID="389aef32-a13f-4210-889b-5f2eb59fae52" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" exitCode=0 Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.329664 4834 generic.go:334] "Generic (PLEG): container finished" podID="389aef32-a13f-4210-889b-5f2eb59fae52" containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" exitCode=2 Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.329672 4834 generic.go:334] "Generic (PLEG): container finished" podID="389aef32-a13f-4210-889b-5f2eb59fae52" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" exitCode=0 Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.329679 4834 generic.go:334] "Generic (PLEG): container finished" podID="389aef32-a13f-4210-889b-5f2eb59fae52" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" exitCode=0 Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.329732 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330355 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-688bb4b854-srcv6"] Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330411 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerDied","Data":"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9"} Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330452 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerDied","Data":"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5"} Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330471 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerDied","Data":"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f"} Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330483 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerDied","Data":"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db"} Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330495 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"389aef32-a13f-4210-889b-5f2eb59fae52","Type":"ContainerDied","Data":"4fda5c44add7e2a9493025b1b4228509337dd0e13490570874aae1b21fe93f26"} Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.330516 4834 scope.go:117] "RemoveContainer" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.351909 4834 scope.go:117] "RemoveContainer" 
containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.376864 4834 scope.go:117] "RemoveContainer" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.397643 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7779b9cfc5-lq477"] Oct 08 22:44:11 crc kubenswrapper[4834]: W1008 22:44:11.401854 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f7d4f35_145c_4af9_9f4b_de8700877370.slice/crio-99e7181eaa3c23e0cb2c438664d40eff23644d038c17f82ce48912c5aa714f94 WatchSource:0}: Error finding container 99e7181eaa3c23e0cb2c438664d40eff23644d038c17f82ce48912c5aa714f94: Status 404 returned error can't find the container with id 99e7181eaa3c23e0cb2c438664d40eff23644d038c17f82ce48912c5aa714f94 Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.415634 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgxzn\" (UniqueName: \"kubernetes.io/projected/389aef32-a13f-4210-889b-5f2eb59fae52-kube-api-access-jgxzn\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.415714 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-config-data\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.415810 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-sg-core-conf-yaml\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: 
\"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.415866 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-run-httpd\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.415934 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-log-httpd\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.415978 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-combined-ca-bundle\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.416027 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-scripts\") pod \"389aef32-a13f-4210-889b-5f2eb59fae52\" (UID: \"389aef32-a13f-4210-889b-5f2eb59fae52\") " Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.417609 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.417828 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.438615 4834 scope.go:117] "RemoveContainer" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.438642 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-scripts" (OuterVolumeSpecName: "scripts") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.438822 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389aef32-a13f-4210-889b-5f2eb59fae52-kube-api-access-jgxzn" (OuterVolumeSpecName: "kube-api-access-jgxzn") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "kube-api-access-jgxzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.449698 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.469072 4834 scope.go:117] "RemoveContainer" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.471716 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": container with ID starting with 6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9 not found: ID does not exist" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.471756 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9"} err="failed to get container status \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": rpc error: code = NotFound desc = could not find container \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": container with ID starting with 6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.471778 4834 scope.go:117] "RemoveContainer" containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.472023 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": container with ID starting with 1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5 not found: ID does not exist" containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472038 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5"} err="failed to get container status \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": rpc error: code = NotFound desc = could not find container \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": container with ID starting with 1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472062 4834 scope.go:117] "RemoveContainer" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.472383 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": container with ID starting with 84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f not found: ID does not exist" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472402 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f"} err="failed to get container status \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": rpc error: code = NotFound desc = could not find container \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": container with ID starting with 84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472417 4834 scope.go:117] "RemoveContainer" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 
22:44:11.472581 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": container with ID starting with 0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db not found: ID does not exist" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472596 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db"} err="failed to get container status \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": rpc error: code = NotFound desc = could not find container \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": container with ID starting with 0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472609 4834 scope.go:117] "RemoveContainer" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472788 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9"} err="failed to get container status \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": rpc error: code = NotFound desc = could not find container \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": container with ID starting with 6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472802 4834 scope.go:117] "RemoveContainer" containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" Oct 08 22:44:11 crc 
kubenswrapper[4834]: I1008 22:44:11.472953 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5"} err="failed to get container status \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": rpc error: code = NotFound desc = could not find container \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": container with ID starting with 1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.472967 4834 scope.go:117] "RemoveContainer" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.473106 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f"} err="failed to get container status \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": rpc error: code = NotFound desc = could not find container \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": container with ID starting with 84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.473121 4834 scope.go:117] "RemoveContainer" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.473283 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db"} err="failed to get container status \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": rpc error: code = NotFound desc = could not find container \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": container 
with ID starting with 0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.473490 4834 scope.go:117] "RemoveContainer" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.473790 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9"} err="failed to get container status \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": rpc error: code = NotFound desc = could not find container \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": container with ID starting with 6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.473805 4834 scope.go:117] "RemoveContainer" containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.474292 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5"} err="failed to get container status \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": rpc error: code = NotFound desc = could not find container \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": container with ID starting with 1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.474308 4834 scope.go:117] "RemoveContainer" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.476895 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f"} err="failed to get container status \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": rpc error: code = NotFound desc = could not find container \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": container with ID starting with 84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.476922 4834 scope.go:117] "RemoveContainer" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.478919 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db"} err="failed to get container status \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": rpc error: code = NotFound desc = could not find container \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": container with ID starting with 0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.478945 4834 scope.go:117] "RemoveContainer" containerID="6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.479213 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9"} err="failed to get container status \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": rpc error: code = NotFound desc = could not find container \"6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9\": container with ID starting with 6293bc11d0077b930b373ac74f284c7aab767e042c1bb56222659cc756315da9 not found: ID does not 
exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.479254 4834 scope.go:117] "RemoveContainer" containerID="1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.479738 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5"} err="failed to get container status \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": rpc error: code = NotFound desc = could not find container \"1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5\": container with ID starting with 1642301d4cb4f064cb581ceb3c98689b0ce8adc9dbd8171a2a4bcac0bf3628e5 not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.480385 4834 scope.go:117] "RemoveContainer" containerID="84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.480757 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f"} err="failed to get container status \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": rpc error: code = NotFound desc = could not find container \"84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f\": container with ID starting with 84665b80d40b1b6e11247bfbe4d04c635118d9831feb93ccb2ec708cdd0d737f not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.480805 4834 scope.go:117] "RemoveContainer" containerID="0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.481010 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db"} err="failed to get container status 
\"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": rpc error: code = NotFound desc = could not find container \"0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db\": container with ID starting with 0855e21e410995e83179e865ebca76e36ea78de756b003f18552b6b9434915db not found: ID does not exist" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.497477 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54df4b685c-fp7gd"] Oct 08 22:44:11 crc kubenswrapper[4834]: W1008 22:44:11.498449 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf24a9f56_4cc7_40e7_abde_51cd4adf512e.slice/crio-4566b216567a19d0966b7a5d3d4f221093b75585c3ecfded36206e29a6f77048 WatchSource:0}: Error finding container 4566b216567a19d0966b7a5d3d4f221093b75585c3ecfded36206e29a6f77048: Status 404 returned error can't find the container with id 4566b216567a19d0966b7a5d3d4f221093b75585c3ecfded36206e29a6f77048 Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.518822 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.518853 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.518867 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/389aef32-a13f-4210-889b-5f2eb59fae52-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.518879 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.518890 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgxzn\" (UniqueName: \"kubernetes.io/projected/389aef32-a13f-4210-889b-5f2eb59fae52-kube-api-access-jgxzn\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.519003 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.556093 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-config-data" (OuterVolumeSpecName: "config-data") pod "389aef32-a13f-4210-889b-5f2eb59fae52" (UID: "389aef32-a13f-4210-889b-5f2eb59fae52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.620853 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.621244 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/389aef32-a13f-4210-889b-5f2eb59fae52-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:11 crc kubenswrapper[4834]: W1008 22:44:11.670874 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3553ff9f_9a8d_40bc_919a_f6a400f001f6.slice/crio-132e0fabcc40740b5e055067a975da306a297f22445cb9111d2434dbedd54ece WatchSource:0}: Error finding container 132e0fabcc40740b5e055067a975da306a297f22445cb9111d2434dbedd54ece: Status 404 returned error can't find the container with id 132e0fabcc40740b5e055067a975da306a297f22445cb9111d2434dbedd54ece Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.671569 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-787c99cdfb-4zdjc"] Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.822417 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.830764 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839389 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.839736 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="proxy-httpd" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839751 
4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="proxy-httpd" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.839770 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-central-agent" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839776 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-central-agent" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.839798 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-notification-agent" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839804 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-notification-agent" Oct 08 22:44:11 crc kubenswrapper[4834]: E1008 22:44:11.839815 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="sg-core" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839822 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="sg-core" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839970 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-notification-agent" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.839980 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="ceilometer-central-agent" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.840000 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="sg-core" Oct 08 22:44:11 crc 
kubenswrapper[4834]: I1008 22:44:11.840014 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" containerName="proxy-httpd" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.841699 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.844310 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.844558 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.858690 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.925086 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcr2\" (UniqueName: \"kubernetes.io/projected/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-kube-api-access-5gcr2\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.925213 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.925240 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:11 crc 
kubenswrapper[4834]: I1008 22:44:11.925437 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-scripts\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.925589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.925708 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:11 crc kubenswrapper[4834]: I1008 22:44:11.925946 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-config-data\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027379 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027425 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027458 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-scripts\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027505 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027550 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027613 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-config-data\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.027647 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcr2\" (UniqueName: \"kubernetes.io/projected/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-kube-api-access-5gcr2\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.028328 
4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-run-httpd\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.028594 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-log-httpd\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.031933 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-scripts\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.032992 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-config-data\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.033123 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.034119 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 
22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.062740 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gcr2\" (UniqueName: \"kubernetes.io/projected/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-kube-api-access-5gcr2\") pod \"ceilometer-0\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.163066 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.372394 4834 generic.go:334] "Generic (PLEG): container finished" podID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerID="93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa" exitCode=0 Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.372543 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" event={"ID":"f24a9f56-4cc7-40e7-abde-51cd4adf512e","Type":"ContainerDied","Data":"93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.372896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" event={"ID":"f24a9f56-4cc7-40e7-abde-51cd4adf512e","Type":"ContainerStarted","Data":"4566b216567a19d0966b7a5d3d4f221093b75585c3ecfded36206e29a6f77048"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.381514 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7779b9cfc5-lq477" event={"ID":"2f7d4f35-145c-4af9-9f4b-de8700877370","Type":"ContainerStarted","Data":"99e7181eaa3c23e0cb2c438664d40eff23644d038c17f82ce48912c5aa714f94"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.385551 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" 
event={"ID":"6122ff69-d6fb-4002-8679-80b826faf58f","Type":"ContainerStarted","Data":"79458c5df48162ca6a34cac62d0eb56dd5e41f478eaa8a4c8e2da867f7740d30"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.387976 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787c99cdfb-4zdjc" event={"ID":"3553ff9f-9a8d-40bc-919a-f6a400f001f6","Type":"ContainerStarted","Data":"7ecc30acf1a932b56177f5900a0493e3740b2d798a4342e2a322d6c870308405"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.388018 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787c99cdfb-4zdjc" event={"ID":"3553ff9f-9a8d-40bc-919a-f6a400f001f6","Type":"ContainerStarted","Data":"7a7b33fee922cee7f61655aff3ab1dedaeaacce62bdf135a48bf5992989ac4bc"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.388027 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787c99cdfb-4zdjc" event={"ID":"3553ff9f-9a8d-40bc-919a-f6a400f001f6","Type":"ContainerStarted","Data":"132e0fabcc40740b5e055067a975da306a297f22445cb9111d2434dbedd54ece"} Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.389833 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.389866 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.429037 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-787c99cdfb-4zdjc" podStartSLOduration=2.429022888 podStartE2EDuration="2.429022888s" podCreationTimestamp="2025-10-08 22:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:12.427596743 +0000 UTC m=+1260.250481509" watchObservedRunningTime="2025-10-08 22:44:12.429022888 
+0000 UTC m=+1260.251907634" Oct 08 22:44:12 crc kubenswrapper[4834]: I1008 22:44:12.643569 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:12 crc kubenswrapper[4834]: W1008 22:44:12.661925 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c5a9ba5_1282_44ad_9d4d_4e0d92d417d6.slice/crio-273c66e481c858aea64258286b63e356d68dd3f340a8085b174211399f740205 WatchSource:0}: Error finding container 273c66e481c858aea64258286b63e356d68dd3f340a8085b174211399f740205: Status 404 returned error can't find the container with id 273c66e481c858aea64258286b63e356d68dd3f340a8085b174211399f740205 Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.428508 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" event={"ID":"f24a9f56-4cc7-40e7-abde-51cd4adf512e","Type":"ContainerStarted","Data":"dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1"} Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.431075 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.437478 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerStarted","Data":"273c66e481c858aea64258286b63e356d68dd3f340a8085b174211399f740205"} Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.469532 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" podStartSLOduration=3.469513176 podStartE2EDuration="3.469513176s" podCreationTimestamp="2025-10-08 22:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:13.463801617 +0000 UTC m=+1261.286686363" 
watchObservedRunningTime="2025-10-08 22:44:13.469513176 +0000 UTC m=+1261.292397922" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.540805 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9f9b7d4b4-cr99t"] Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.542867 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.547606 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.547887 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.616341 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389aef32-a13f-4210-889b-5f2eb59fae52" path="/var/lib/kubelet/pods/389aef32-a13f-4210-889b-5f2eb59fae52/volumes" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.617358 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9f9b7d4b4-cr99t"] Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.670709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-combined-ca-bundle\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.670749 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6th\" (UniqueName: \"kubernetes.io/projected/a163bab0-7bd2-4272-a1f0-cd0090eed141-kube-api-access-fk6th\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " 
pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.670862 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data-custom\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.670881 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a163bab0-7bd2-4272-a1f0-cd0090eed141-logs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.670942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-internal-tls-certs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.670981 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.671001 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-public-tls-certs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: 
\"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772746 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data-custom\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772818 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a163bab0-7bd2-4272-a1f0-cd0090eed141-logs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772865 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-internal-tls-certs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772896 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772917 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-public-tls-certs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " 
pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772945 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-combined-ca-bundle\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.772967 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6th\" (UniqueName: \"kubernetes.io/projected/a163bab0-7bd2-4272-a1f0-cd0090eed141-kube-api-access-fk6th\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.773579 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a163bab0-7bd2-4272-a1f0-cd0090eed141-logs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.778644 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-internal-tls-certs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.779215 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-public-tls-certs\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc 
kubenswrapper[4834]: I1008 22:44:13.779250 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data-custom\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.780032 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.780720 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-combined-ca-bundle\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.787826 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6th\" (UniqueName: \"kubernetes.io/projected/a163bab0-7bd2-4272-a1f0-cd0090eed141-kube-api-access-fk6th\") pod \"barbican-api-9f9b7d4b4-cr99t\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:13 crc kubenswrapper[4834]: I1008 22:44:13.903739 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:14 crc kubenswrapper[4834]: I1008 22:44:14.448044 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerStarted","Data":"6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5"} Oct 08 22:44:14 crc kubenswrapper[4834]: I1008 22:44:14.451389 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7779b9cfc5-lq477" event={"ID":"2f7d4f35-145c-4af9-9f4b-de8700877370","Type":"ContainerStarted","Data":"49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387"} Oct 08 22:44:14 crc kubenswrapper[4834]: I1008 22:44:14.453439 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" event={"ID":"6122ff69-d6fb-4002-8679-80b826faf58f","Type":"ContainerStarted","Data":"e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a"} Oct 08 22:44:14 crc kubenswrapper[4834]: I1008 22:44:14.520638 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9f9b7d4b4-cr99t"] Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.462603 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-449pd" event={"ID":"15a24e03-f3be-433f-bbc1-3a25da713c65","Type":"ContainerStarted","Data":"57b633ed2d101f8136f8501326ad9126727b1b6fbd68324d56f1644808d3cdaf"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.464947 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerStarted","Data":"f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.466696 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7779b9cfc5-lq477" 
event={"ID":"2f7d4f35-145c-4af9-9f4b-de8700877370","Type":"ContainerStarted","Data":"91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.469221 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" event={"ID":"6122ff69-d6fb-4002-8679-80b826faf58f","Type":"ContainerStarted","Data":"f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.472000 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9f9b7d4b4-cr99t" event={"ID":"a163bab0-7bd2-4272-a1f0-cd0090eed141","Type":"ContainerStarted","Data":"fd63b549a1986038c721e698592fdda290ebfb583fc9734ad9ac998ea85e14d3"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.472050 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9f9b7d4b4-cr99t" event={"ID":"a163bab0-7bd2-4272-a1f0-cd0090eed141","Type":"ContainerStarted","Data":"ee03e22162c15519f33753c495b20abc4d67e8ca443ee3f90594ba5404a7a3a2"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.472067 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9f9b7d4b4-cr99t" event={"ID":"a163bab0-7bd2-4272-a1f0-cd0090eed141","Type":"ContainerStarted","Data":"b90057f3d2386a8e0be598e7cb620f1af13ad5d1bd77db7b16ee6b1aecff30d7"} Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.485851 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-449pd" podStartSLOduration=3.116377155 podStartE2EDuration="2m3.485828045s" podCreationTimestamp="2025-10-08 22:42:12 +0000 UTC" firstStartedPulling="2025-10-08 22:42:13.63204248 +0000 UTC m=+1141.454927226" lastFinishedPulling="2025-10-08 22:44:14.00149337 +0000 UTC m=+1261.824378116" observedRunningTime="2025-10-08 22:44:15.479391098 +0000 UTC m=+1263.302275844" watchObservedRunningTime="2025-10-08 
22:44:15.485828045 +0000 UTC m=+1263.308712791" Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.501086 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9f9b7d4b4-cr99t" podStartSLOduration=2.501068513 podStartE2EDuration="2.501068513s" podCreationTimestamp="2025-10-08 22:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:15.496093413 +0000 UTC m=+1263.318978179" watchObservedRunningTime="2025-10-08 22:44:15.501068513 +0000 UTC m=+1263.323953259" Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.526208 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7779b9cfc5-lq477" podStartSLOduration=2.965684605 podStartE2EDuration="5.526190653s" podCreationTimestamp="2025-10-08 22:44:10 +0000 UTC" firstStartedPulling="2025-10-08 22:44:11.441678649 +0000 UTC m=+1259.264563395" lastFinishedPulling="2025-10-08 22:44:14.002184697 +0000 UTC m=+1261.825069443" observedRunningTime="2025-10-08 22:44:15.523925138 +0000 UTC m=+1263.346809894" watchObservedRunningTime="2025-10-08 22:44:15.526190653 +0000 UTC m=+1263.349075399" Oct 08 22:44:15 crc kubenswrapper[4834]: I1008 22:44:15.544898 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" podStartSLOduration=2.874092144 podStartE2EDuration="5.544878627s" podCreationTimestamp="2025-10-08 22:44:10 +0000 UTC" firstStartedPulling="2025-10-08 22:44:11.331269881 +0000 UTC m=+1259.154154627" lastFinishedPulling="2025-10-08 22:44:14.002056364 +0000 UTC m=+1261.824941110" observedRunningTime="2025-10-08 22:44:15.54175561 +0000 UTC m=+1263.364640356" watchObservedRunningTime="2025-10-08 22:44:15.544878627 +0000 UTC m=+1263.367763373" Oct 08 22:44:16 crc kubenswrapper[4834]: I1008 22:44:16.482777 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerStarted","Data":"69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2"} Oct 08 22:44:16 crc kubenswrapper[4834]: I1008 22:44:16.484589 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bdkgz" event={"ID":"910a7214-6f0f-452b-adc2-91d1c2589d47","Type":"ContainerDied","Data":"299426ab604bd850b419cb8a481e6164770f8d6cf6ebc2e23b67097f996fed1d"} Oct 08 22:44:16 crc kubenswrapper[4834]: I1008 22:44:16.484021 4834 generic.go:334] "Generic (PLEG): container finished" podID="910a7214-6f0f-452b-adc2-91d1c2589d47" containerID="299426ab604bd850b419cb8a481e6164770f8d6cf6ebc2e23b67097f996fed1d" exitCode=0 Oct 08 22:44:16 crc kubenswrapper[4834]: I1008 22:44:16.485037 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:16 crc kubenswrapper[4834]: I1008 22:44:16.485582 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:17 crc kubenswrapper[4834]: I1008 22:44:17.845278 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:44:17 crc kubenswrapper[4834]: I1008 22:44:17.972318 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-config\") pod \"910a7214-6f0f-452b-adc2-91d1c2589d47\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " Oct 08 22:44:17 crc kubenswrapper[4834]: I1008 22:44:17.972378 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-combined-ca-bundle\") pod \"910a7214-6f0f-452b-adc2-91d1c2589d47\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " Oct 08 22:44:17 crc kubenswrapper[4834]: I1008 22:44:17.972411 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgvrz\" (UniqueName: \"kubernetes.io/projected/910a7214-6f0f-452b-adc2-91d1c2589d47-kube-api-access-kgvrz\") pod \"910a7214-6f0f-452b-adc2-91d1c2589d47\" (UID: \"910a7214-6f0f-452b-adc2-91d1c2589d47\") " Oct 08 22:44:17 crc kubenswrapper[4834]: I1008 22:44:17.975763 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910a7214-6f0f-452b-adc2-91d1c2589d47-kube-api-access-kgvrz" (OuterVolumeSpecName: "kube-api-access-kgvrz") pod "910a7214-6f0f-452b-adc2-91d1c2589d47" (UID: "910a7214-6f0f-452b-adc2-91d1c2589d47"). InnerVolumeSpecName "kube-api-access-kgvrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:17 crc kubenswrapper[4834]: I1008 22:44:17.997882 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-config" (OuterVolumeSpecName: "config") pod "910a7214-6f0f-452b-adc2-91d1c2589d47" (UID: "910a7214-6f0f-452b-adc2-91d1c2589d47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.001387 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "910a7214-6f0f-452b-adc2-91d1c2589d47" (UID: "910a7214-6f0f-452b-adc2-91d1c2589d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.074521 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.074562 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgvrz\" (UniqueName: \"kubernetes.io/projected/910a7214-6f0f-452b-adc2-91d1c2589d47-kube-api-access-kgvrz\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.074575 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/910a7214-6f0f-452b-adc2-91d1c2589d47-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.187247 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.503377 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerStarted","Data":"cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91"} Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.503517 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 
22:44:18.505110 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bdkgz" event={"ID":"910a7214-6f0f-452b-adc2-91d1c2589d47","Type":"ContainerDied","Data":"83a6100e1df923d9a3f912b359ca3541a31605b56844e62a6c02d5feafa5c6f0"} Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.505194 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a6100e1df923d9a3f912b359ca3541a31605b56844e62a6c02d5feafa5c6f0" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.505248 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bdkgz" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.531240 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.344938805 podStartE2EDuration="7.531197314s" podCreationTimestamp="2025-10-08 22:44:11 +0000 UTC" firstStartedPulling="2025-10-08 22:44:12.673305483 +0000 UTC m=+1260.496190229" lastFinishedPulling="2025-10-08 22:44:17.859563992 +0000 UTC m=+1265.682448738" observedRunningTime="2025-10-08 22:44:18.529046641 +0000 UTC m=+1266.351931387" watchObservedRunningTime="2025-10-08 22:44:18.531197314 +0000 UTC m=+1266.354082070" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.752606 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54df4b685c-fp7gd"] Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.752832 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerName="dnsmasq-dns" containerID="cri-o://dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1" gracePeriod=10 Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.754766 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:18 crc 
kubenswrapper[4834]: I1008 22:44:18.797381 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cddb74997-k5mcv"] Oct 08 22:44:18 crc kubenswrapper[4834]: E1008 22:44:18.797934 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910a7214-6f0f-452b-adc2-91d1c2589d47" containerName="neutron-db-sync" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.797945 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="910a7214-6f0f-452b-adc2-91d1c2589d47" containerName="neutron-db-sync" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.798107 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="910a7214-6f0f-452b-adc2-91d1c2589d47" containerName="neutron-db-sync" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.798963 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.834651 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cddb74997-k5mcv"] Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.889956 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5q9\" (UniqueName: \"kubernetes.io/projected/2e4a70f3-08bc-4136-a541-5027038cf824-kube-api-access-lc5q9\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.890009 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-nb\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.890038 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-config\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.890079 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-swift-storage-0\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.890233 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-sb\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.890353 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-svc\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.935064 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5846bc496d-9g2mt"] Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.936820 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.939257 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lfnk9" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.939627 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.939737 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.939920 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.943030 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5846bc496d-9g2mt"] Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991555 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-swift-storage-0\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-ovndb-tls-certs\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991638 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-sb\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: 
\"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991698 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-combined-ca-bundle\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991733 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-svc\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991769 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5q9\" (UniqueName: \"kubernetes.io/projected/2e4a70f3-08bc-4136-a541-5027038cf824-kube-api-access-lc5q9\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991793 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-nb\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991811 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-config\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " 
pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991871 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-httpd-config\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991893 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-config\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.991913 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf9jz\" (UniqueName: \"kubernetes.io/projected/ad08d0fa-74e3-4211-a991-3e12be132fca-kube-api-access-xf9jz\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.994639 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-sb\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.995130 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-svc\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc 
kubenswrapper[4834]: I1008 22:44:18.995653 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-config\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.996216 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-nb\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:18 crc kubenswrapper[4834]: I1008 22:44:18.997506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-swift-storage-0\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.015263 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5q9\" (UniqueName: \"kubernetes.io/projected/2e4a70f3-08bc-4136-a541-5027038cf824-kube-api-access-lc5q9\") pod \"dnsmasq-dns-cddb74997-k5mcv\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.093364 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-httpd-config\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.093429 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-config\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.093458 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9jz\" (UniqueName: \"kubernetes.io/projected/ad08d0fa-74e3-4211-a991-3e12be132fca-kube-api-access-xf9jz\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.093500 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-ovndb-tls-certs\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.093608 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-combined-ca-bundle\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.099371 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-httpd-config\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.124264 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf9jz\" (UniqueName: 
\"kubernetes.io/projected/ad08d0fa-74e3-4211-a991-3e12be132fca-kube-api-access-xf9jz\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.126861 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-combined-ca-bundle\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.127041 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-config\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.178804 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-ovndb-tls-certs\") pod \"neutron-5846bc496d-9g2mt\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.221598 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.258354 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.331560 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.419044 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-swift-storage-0\") pod \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.419199 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-config\") pod \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.419221 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-sb\") pod \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.419261 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn44d\" (UniqueName: \"kubernetes.io/projected/f24a9f56-4cc7-40e7-abde-51cd4adf512e-kube-api-access-dn44d\") pod \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.419332 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-svc\") pod \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.419418 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-nb\") pod \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\" (UID: \"f24a9f56-4cc7-40e7-abde-51cd4adf512e\") " Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.441726 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24a9f56-4cc7-40e7-abde-51cd4adf512e-kube-api-access-dn44d" (OuterVolumeSpecName: "kube-api-access-dn44d") pod "f24a9f56-4cc7-40e7-abde-51cd4adf512e" (UID: "f24a9f56-4cc7-40e7-abde-51cd4adf512e"). InnerVolumeSpecName "kube-api-access-dn44d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.521750 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn44d\" (UniqueName: \"kubernetes.io/projected/f24a9f56-4cc7-40e7-abde-51cd4adf512e-kube-api-access-dn44d\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.526684 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f24a9f56-4cc7-40e7-abde-51cd4adf512e" (UID: "f24a9f56-4cc7-40e7-abde-51cd4adf512e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.562523 4834 generic.go:334] "Generic (PLEG): container finished" podID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerID="dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1" exitCode=0 Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.562792 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.564054 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f24a9f56-4cc7-40e7-abde-51cd4adf512e" (UID: "f24a9f56-4cc7-40e7-abde-51cd4adf512e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.587529 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f24a9f56-4cc7-40e7-abde-51cd4adf512e" (UID: "f24a9f56-4cc7-40e7-abde-51cd4adf512e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.587967 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f24a9f56-4cc7-40e7-abde-51cd4adf512e" (UID: "f24a9f56-4cc7-40e7-abde-51cd4adf512e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.610935 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-config" (OuterVolumeSpecName: "config") pod "f24a9f56-4cc7-40e7-abde-51cd4adf512e" (UID: "f24a9f56-4cc7-40e7-abde-51cd4adf512e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.626304 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.626511 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.626522 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.626531 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.626539 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24a9f56-4cc7-40e7-abde-51cd4adf512e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.673862 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" event={"ID":"f24a9f56-4cc7-40e7-abde-51cd4adf512e","Type":"ContainerDied","Data":"dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1"} Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.673906 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df4b685c-fp7gd" event={"ID":"f24a9f56-4cc7-40e7-abde-51cd4adf512e","Type":"ContainerDied","Data":"4566b216567a19d0966b7a5d3d4f221093b75585c3ecfded36206e29a6f77048"} 
Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.673929 4834 scope.go:117] "RemoveContainer" containerID="dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.714719 4834 scope.go:117] "RemoveContainer" containerID="93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.791904 4834 scope.go:117] "RemoveContainer" containerID="dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1" Oct 08 22:44:19 crc kubenswrapper[4834]: E1008 22:44:19.792786 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1\": container with ID starting with dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1 not found: ID does not exist" containerID="dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.792823 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1"} err="failed to get container status \"dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1\": rpc error: code = NotFound desc = could not find container \"dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1\": container with ID starting with dcb1254ce3580bff4e9f72513cc0b7873f3c850fb133b093fcbbc0829a4931e1 not found: ID does not exist" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.794073 4834 scope.go:117] "RemoveContainer" containerID="93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa" Oct 08 22:44:19 crc kubenswrapper[4834]: E1008 22:44:19.796480 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa\": container with ID starting with 93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa not found: ID does not exist" containerID="93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.796520 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa"} err="failed to get container status \"93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa\": rpc error: code = NotFound desc = could not find container \"93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa\": container with ID starting with 93a31efcab06567ca1c0b133ea347a1227179af25383d245fe3f60c5ba77f1fa not found: ID does not exist" Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.893198 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54df4b685c-fp7gd"] Oct 08 22:44:19 crc kubenswrapper[4834]: I1008 22:44:19.904948 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54df4b685c-fp7gd"] Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.016874 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cddb74997-k5mcv"] Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.081422 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-787c99cdfb-4zdjc" Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.136207 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.157887 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.288821 4834 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-5846bc496d-9g2mt"] Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.581306 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5846bc496d-9g2mt" event={"ID":"ad08d0fa-74e3-4211-a991-3e12be132fca","Type":"ContainerStarted","Data":"7f7fcb3fed5113e971d22256630f2029c631e9bfd72d57cac053b2e6f4968522"} Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.591559 4834 generic.go:334] "Generic (PLEG): container finished" podID="2e4a70f3-08bc-4136-a541-5027038cf824" containerID="c1f65059306adb156fafa469244daa446a690f724849d1ee8757c325fabaefa7" exitCode=0 Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.591621 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" event={"ID":"2e4a70f3-08bc-4136-a541-5027038cf824","Type":"ContainerDied","Data":"c1f65059306adb156fafa469244daa446a690f724849d1ee8757c325fabaefa7"} Oct 08 22:44:20 crc kubenswrapper[4834]: I1008 22:44:20.591649 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" event={"ID":"2e4a70f3-08bc-4136-a541-5027038cf824","Type":"ContainerStarted","Data":"00c713db00bbc8abd08b68a986b96fa9e8fc10375ce25acc1ae457e8ffea6479"} Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.254288 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f6cc747c5-vzjm2"] Oct 08 22:44:21 crc kubenswrapper[4834]: E1008 22:44:21.255681 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerName="init" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.255703 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerName="init" Oct 08 22:44:21 crc kubenswrapper[4834]: E1008 22:44:21.255719 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerName="dnsmasq-dns" Oct 08 
22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.255727 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerName="dnsmasq-dns" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.255980 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" containerName="dnsmasq-dns" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.257410 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.264099 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.264471 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269497 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-public-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269525 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-combined-ca-bundle\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269548 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbkx\" (UniqueName: \"kubernetes.io/projected/62795e13-2e9c-4656-ab88-8788e50d37c5-kube-api-access-wqbkx\") pod 
\"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269652 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-ovndb-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269742 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-config\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269812 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-internal-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.269969 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-httpd-config\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.271365 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f6cc747c5-vzjm2"] Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.371995 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-httpd-config\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.372077 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-public-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.372098 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-combined-ca-bundle\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.372118 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbkx\" (UniqueName: \"kubernetes.io/projected/62795e13-2e9c-4656-ab88-8788e50d37c5-kube-api-access-wqbkx\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.372156 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-ovndb-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.372189 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-config\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.372223 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-internal-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.391607 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-httpd-config\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.391616 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-combined-ca-bundle\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.392370 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-ovndb-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.395416 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-config\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: 
\"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.399157 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-internal-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.407941 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-public-tls-certs\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.436918 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbkx\" (UniqueName: \"kubernetes.io/projected/62795e13-2e9c-4656-ab88-8788e50d37c5-kube-api-access-wqbkx\") pod \"neutron-6f6cc747c5-vzjm2\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.567315 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24a9f56-4cc7-40e7-abde-51cd4adf512e" path="/var/lib/kubelet/pods/f24a9f56-4cc7-40e7-abde-51cd4adf512e/volumes" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.573707 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.635043 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" event={"ID":"2e4a70f3-08bc-4136-a541-5027038cf824","Type":"ContainerStarted","Data":"c0088ca879ba8d1ddb5b1b6afef95732182c674734219c0d0ffd9c984bdc02bb"} Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.636186 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.656212 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" podStartSLOduration=3.656194973 podStartE2EDuration="3.656194973s" podCreationTimestamp="2025-10-08 22:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:21.650909635 +0000 UTC m=+1269.473794381" watchObservedRunningTime="2025-10-08 22:44:21.656194973 +0000 UTC m=+1269.479079719" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.668275 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5846bc496d-9g2mt" event={"ID":"ad08d0fa-74e3-4211-a991-3e12be132fca","Type":"ContainerStarted","Data":"d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06"} Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.668321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5846bc496d-9g2mt" event={"ID":"ad08d0fa-74e3-4211-a991-3e12be132fca","Type":"ContainerStarted","Data":"a20ef257a47565feda7278ac014084ef785869b3b364fc58d6f71aabd1ca4509"} Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.669328 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.699242 4834 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5846bc496d-9g2mt" podStartSLOduration=3.6992249470000003 podStartE2EDuration="3.699224947s" podCreationTimestamp="2025-10-08 22:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:21.692170026 +0000 UTC m=+1269.515054772" watchObservedRunningTime="2025-10-08 22:44:21.699224947 +0000 UTC m=+1269.522109683" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.807344 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-f0f4-account-create-hgs5p"] Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.809226 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.813774 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.825736 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f0f4-account-create-hgs5p"] Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.897975 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mvh\" (UniqueName: \"kubernetes.io/projected/0ccc4594-4823-4520-af7d-213d6dac2490-kube-api-access-x5mvh\") pod \"nova-api-f0f4-account-create-hgs5p\" (UID: \"0ccc4594-4823-4520-af7d-213d6dac2490\") " pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.986334 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2ad8-account-create-x9wt9"] Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.987594 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.989575 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 08 22:44:21 crc kubenswrapper[4834]: I1008 22:44:21.993199 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2ad8-account-create-x9wt9"] Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:21.999715 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mvh\" (UniqueName: \"kubernetes.io/projected/0ccc4594-4823-4520-af7d-213d6dac2490-kube-api-access-x5mvh\") pod \"nova-api-f0f4-account-create-hgs5p\" (UID: \"0ccc4594-4823-4520-af7d-213d6dac2490\") " pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.030906 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mvh\" (UniqueName: \"kubernetes.io/projected/0ccc4594-4823-4520-af7d-213d6dac2490-kube-api-access-x5mvh\") pod \"nova-api-f0f4-account-create-hgs5p\" (UID: \"0ccc4594-4823-4520-af7d-213d6dac2490\") " pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.101670 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxltg\" (UniqueName: \"kubernetes.io/projected/6082b9c6-252a-49ee-bcd5-7d58bd99ff23-kube-api-access-rxltg\") pod \"nova-cell0-2ad8-account-create-x9wt9\" (UID: \"6082b9c6-252a-49ee-bcd5-7d58bd99ff23\") " pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.141547 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.176890 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.177407 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-central-agent" containerID="cri-o://6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5" gracePeriod=30 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.177469 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="proxy-httpd" containerID="cri-o://cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91" gracePeriod=30 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.177494 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="sg-core" containerID="cri-o://69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2" gracePeriod=30 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.177509 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-notification-agent" containerID="cri-o://f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b" gracePeriod=30 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.203450 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxltg\" (UniqueName: \"kubernetes.io/projected/6082b9c6-252a-49ee-bcd5-7d58bd99ff23-kube-api-access-rxltg\") pod \"nova-cell0-2ad8-account-create-x9wt9\" (UID: \"6082b9c6-252a-49ee-bcd5-7d58bd99ff23\") " 
pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.228524 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c373-account-create-ld9q7"] Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.229730 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.235509 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.238624 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxltg\" (UniqueName: \"kubernetes.io/projected/6082b9c6-252a-49ee-bcd5-7d58bd99ff23-kube-api-access-rxltg\") pod \"nova-cell0-2ad8-account-create-x9wt9\" (UID: \"6082b9c6-252a-49ee-bcd5-7d58bd99ff23\") " pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.252321 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c373-account-create-ld9q7"] Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.306836 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.307674 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x2lc\" (UniqueName: \"kubernetes.io/projected/01c52817-5a8e-480e-847c-eceaba519de6-kube-api-access-4x2lc\") pod \"nova-cell1-c373-account-create-ld9q7\" (UID: \"01c52817-5a8e-480e-847c-eceaba519de6\") " pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.362242 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f6cc747c5-vzjm2"] Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.409677 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x2lc\" (UniqueName: \"kubernetes.io/projected/01c52817-5a8e-480e-847c-eceaba519de6-kube-api-access-4x2lc\") pod \"nova-cell1-c373-account-create-ld9q7\" (UID: \"01c52817-5a8e-480e-847c-eceaba519de6\") " pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.436751 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x2lc\" (UniqueName: \"kubernetes.io/projected/01c52817-5a8e-480e-847c-eceaba519de6-kube-api-access-4x2lc\") pod \"nova-cell1-c373-account-create-ld9q7\" (UID: \"01c52817-5a8e-480e-847c-eceaba519de6\") " pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.648552 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.680607 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6cc747c5-vzjm2" event={"ID":"62795e13-2e9c-4656-ab88-8788e50d37c5","Type":"ContainerStarted","Data":"51b9abfcc3e72b36455f7cd5a244162d8f78ed0aca8fdc256e54b5fb3b57ab7e"} Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.681935 4834 generic.go:334] "Generic (PLEG): container finished" podID="15a24e03-f3be-433f-bbc1-3a25da713c65" containerID="57b633ed2d101f8136f8501326ad9126727b1b6fbd68324d56f1644808d3cdaf" exitCode=0 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.682033 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-449pd" event={"ID":"15a24e03-f3be-433f-bbc1-3a25da713c65","Type":"ContainerDied","Data":"57b633ed2d101f8136f8501326ad9126727b1b6fbd68324d56f1644808d3cdaf"} Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.691328 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerID="cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91" exitCode=0 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.691358 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerID="69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2" exitCode=2 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.691366 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerID="f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b" exitCode=0 Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.691409 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerDied","Data":"cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91"} Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.691436 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerDied","Data":"69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2"} Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.691445 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerDied","Data":"f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b"} Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.702208 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6","Type":"ContainerStarted","Data":"048b028a05ed9d1e34b226eae2432e5d45037864ccaf2322ecdfa230f03f479c"} Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.740389 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=12.490649025 podStartE2EDuration="48.740372281s" podCreationTimestamp="2025-10-08 22:43:34 +0000 UTC" firstStartedPulling="2025-10-08 22:43:45.325218336 +0000 UTC m=+1233.148103072" lastFinishedPulling="2025-10-08 22:44:21.574941582 +0000 UTC m=+1269.397826328" observedRunningTime="2025-10-08 22:44:22.724498026 +0000 UTC m=+1270.547382772" watchObservedRunningTime="2025-10-08 22:44:22.740372281 +0000 UTC m=+1270.563257027" Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.778424 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-f0f4-account-create-hgs5p"] Oct 08 22:44:22 crc kubenswrapper[4834]: I1008 22:44:22.936196 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2ad8-account-create-x9wt9"] 
Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.131349 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.227070 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c373-account-create-ld9q7"] Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245192 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-scripts\") pod \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245250 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-combined-ca-bundle\") pod \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245296 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-log-httpd\") pod \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245313 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-sg-core-conf-yaml\") pod \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245372 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-run-httpd\") pod 
\"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245397 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-config-data\") pod \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.245467 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gcr2\" (UniqueName: \"kubernetes.io/projected/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-kube-api-access-5gcr2\") pod \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\" (UID: \"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6\") " Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.246017 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.246468 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.252394 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-kube-api-access-5gcr2" (OuterVolumeSpecName: "kube-api-access-5gcr2") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "kube-api-access-5gcr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.259328 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-scripts" (OuterVolumeSpecName: "scripts") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.277974 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.347360 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.347678 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.347687 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.347697 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.347705 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gcr2\" (UniqueName: \"kubernetes.io/projected/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-kube-api-access-5gcr2\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.357467 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.384504 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-config-data" (OuterVolumeSpecName: "config-data") pod "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" (UID: "8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.449268 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.449299 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.711006 4834 generic.go:334] "Generic (PLEG): container finished" podID="01c52817-5a8e-480e-847c-eceaba519de6" containerID="d26dc2b24f75c6416e3da5affb9ed2e0641dc5567fec5b856133b898272689ca" exitCode=0 Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.711075 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c373-account-create-ld9q7" event={"ID":"01c52817-5a8e-480e-847c-eceaba519de6","Type":"ContainerDied","Data":"d26dc2b24f75c6416e3da5affb9ed2e0641dc5567fec5b856133b898272689ca"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.711538 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c373-account-create-ld9q7" event={"ID":"01c52817-5a8e-480e-847c-eceaba519de6","Type":"ContainerStarted","Data":"ddbc09e48e7dc61663515e9d55637d423405f582afa3d303e89f001ba59b9a0c"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.713812 4834 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6cc747c5-vzjm2" event={"ID":"62795e13-2e9c-4656-ab88-8788e50d37c5","Type":"ContainerStarted","Data":"4c640bd73e780e0d64b84db93f10b09fd43cf57588febe8f44765e1d6c226f04"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.713927 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6cc747c5-vzjm2" event={"ID":"62795e13-2e9c-4656-ab88-8788e50d37c5","Type":"ContainerStarted","Data":"0994fee875d4da6ff390c9c0551593b552bdf75310faf54c410141d37e0f066c"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.714016 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.715382 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ccc4594-4823-4520-af7d-213d6dac2490" containerID="2c075f9f313e798e44f653a417d4b03da76fd6a78630718227b5b296a78fcde3" exitCode=0 Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.715521 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f0f4-account-create-hgs5p" event={"ID":"0ccc4594-4823-4520-af7d-213d6dac2490","Type":"ContainerDied","Data":"2c075f9f313e798e44f653a417d4b03da76fd6a78630718227b5b296a78fcde3"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.715610 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f0f4-account-create-hgs5p" event={"ID":"0ccc4594-4823-4520-af7d-213d6dac2490","Type":"ContainerStarted","Data":"8ffa38a96c57a24cc3dee40ddc3755911b6a6c7ef442b71817a6d7b6c5da798a"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.717118 4834 generic.go:334] "Generic (PLEG): container finished" podID="6082b9c6-252a-49ee-bcd5-7d58bd99ff23" containerID="28b6df0ac931544576daa108f0b9d18f976008b14b16d6e24e5fafe553822bd9" exitCode=0 Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.717271 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-2ad8-account-create-x9wt9" event={"ID":"6082b9c6-252a-49ee-bcd5-7d58bd99ff23","Type":"ContainerDied","Data":"28b6df0ac931544576daa108f0b9d18f976008b14b16d6e24e5fafe553822bd9"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.717360 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2ad8-account-create-x9wt9" event={"ID":"6082b9c6-252a-49ee-bcd5-7d58bd99ff23","Type":"ContainerStarted","Data":"ffcc5489d8685095732259011cb9f755df157abb44b5e9f4b90f8d697ca723b2"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.720294 4834 generic.go:334] "Generic (PLEG): container finished" podID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerID="6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5" exitCode=0 Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.720382 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.720424 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerDied","Data":"6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.720448 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6","Type":"ContainerDied","Data":"273c66e481c858aea64258286b63e356d68dd3f340a8085b174211399f740205"} Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.720464 4834 scope.go:117] "RemoveContainer" containerID="cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.756305 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f6cc747c5-vzjm2" podStartSLOduration=2.756283763 podStartE2EDuration="2.756283763s" 
podCreationTimestamp="2025-10-08 22:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:23.749378525 +0000 UTC m=+1271.572263281" watchObservedRunningTime="2025-10-08 22:44:23.756283763 +0000 UTC m=+1271.579168519" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.764136 4834 scope.go:117] "RemoveContainer" containerID="69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.819277 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.837572 4834 scope.go:117] "RemoveContainer" containerID="f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.841906 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.859458 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.860460 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-central-agent" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.860478 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-central-agent" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.861460 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="sg-core" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.861512 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="sg-core" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.861528 4834 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="proxy-httpd" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.861535 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="proxy-httpd" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.861564 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-notification-agent" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.861570 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-notification-agent" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.861980 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="proxy-httpd" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.862011 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-central-agent" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.862031 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="sg-core" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.862050 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" containerName="ceilometer-notification-agent" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.874266 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.880572 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.880753 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.894246 4834 scope.go:117] "RemoveContainer" containerID="6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.899007 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.928333 4834 scope.go:117] "RemoveContainer" containerID="cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.932310 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91\": container with ID starting with cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91 not found: ID does not exist" containerID="cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.932352 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91"} err="failed to get container status \"cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91\": rpc error: code = NotFound desc = could not find container \"cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91\": container with ID starting with cf4e438e0139a5cad719de557e7678041bf950b143fa7f6c3833a48eaf063b91 not found: ID does not exist" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 
22:44:23.932384 4834 scope.go:117] "RemoveContainer" containerID="69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.933219 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2\": container with ID starting with 69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2 not found: ID does not exist" containerID="69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.933275 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2"} err="failed to get container status \"69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2\": rpc error: code = NotFound desc = could not find container \"69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2\": container with ID starting with 69d44aaa96bb878f01c2a032ad93197b30afe6ddbf9eaf8ed6898011eaef57b2 not found: ID does not exist" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.933301 4834 scope.go:117] "RemoveContainer" containerID="f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.934747 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b\": container with ID starting with f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b not found: ID does not exist" containerID="f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.934772 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b"} err="failed to get container status \"f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b\": rpc error: code = NotFound desc = could not find container \"f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b\": container with ID starting with f41a40eb960ca8b363cc83eacc21ef7d8ff92838fe25f3e4f2aaec45e768a33b not found: ID does not exist" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.934786 4834 scope.go:117] "RemoveContainer" containerID="6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5" Oct 08 22:44:23 crc kubenswrapper[4834]: E1008 22:44:23.935066 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5\": container with ID starting with 6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5 not found: ID does not exist" containerID="6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.935100 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5"} err="failed to get container status \"6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5\": rpc error: code = NotFound desc = could not find container \"6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5\": container with ID starting with 6fd03c6dd6daf83c86d2e61b2d67db67bb5285413ea27c7475a5dd2ee69186d5 not found: ID does not exist" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.965322 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-config-data\") pod \"ceilometer-0\" (UID: 
\"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.965584 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-log-httpd\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.965675 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-scripts\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.965795 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-run-httpd\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.965874 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.965968 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwwfk\" (UniqueName: \"kubernetes.io/projected/05900eae-0749-428d-9c98-65b20ccaef25-kube-api-access-xwwfk\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:23 crc kubenswrapper[4834]: I1008 22:44:23.966034 
4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067733 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-run-httpd\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067785 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067830 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwwfk\" (UniqueName: \"kubernetes.io/projected/05900eae-0749-428d-9c98-65b20ccaef25-kube-api-access-xwwfk\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067849 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067880 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-config-data\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067902 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-log-httpd\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.067935 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-scripts\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.068739 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-run-httpd\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.068929 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-log-httpd\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.077345 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-scripts\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.080957 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-config-data\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.092665 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.096640 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwwfk\" (UniqueName: \"kubernetes.io/projected/05900eae-0749-428d-9c98-65b20ccaef25-kube-api-access-xwwfk\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.116863 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.198822 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-449pd" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.206959 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.273661 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-config-data\") pod \"15a24e03-f3be-433f-bbc1-3a25da713c65\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.273739 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kckgb\" (UniqueName: \"kubernetes.io/projected/15a24e03-f3be-433f-bbc1-3a25da713c65-kube-api-access-kckgb\") pod \"15a24e03-f3be-433f-bbc1-3a25da713c65\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.273777 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-scripts\") pod \"15a24e03-f3be-433f-bbc1-3a25da713c65\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.273843 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-db-sync-config-data\") pod \"15a24e03-f3be-433f-bbc1-3a25da713c65\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.273863 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15a24e03-f3be-433f-bbc1-3a25da713c65-etc-machine-id\") pod \"15a24e03-f3be-433f-bbc1-3a25da713c65\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.273961 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-combined-ca-bundle\") pod \"15a24e03-f3be-433f-bbc1-3a25da713c65\" (UID: \"15a24e03-f3be-433f-bbc1-3a25da713c65\") " Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.276213 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15a24e03-f3be-433f-bbc1-3a25da713c65-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "15a24e03-f3be-433f-bbc1-3a25da713c65" (UID: "15a24e03-f3be-433f-bbc1-3a25da713c65"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.282218 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a24e03-f3be-433f-bbc1-3a25da713c65-kube-api-access-kckgb" (OuterVolumeSpecName: "kube-api-access-kckgb") pod "15a24e03-f3be-433f-bbc1-3a25da713c65" (UID: "15a24e03-f3be-433f-bbc1-3a25da713c65"). InnerVolumeSpecName "kube-api-access-kckgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.291001 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-scripts" (OuterVolumeSpecName: "scripts") pod "15a24e03-f3be-433f-bbc1-3a25da713c65" (UID: "15a24e03-f3be-433f-bbc1-3a25da713c65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.291066 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "15a24e03-f3be-433f-bbc1-3a25da713c65" (UID: "15a24e03-f3be-433f-bbc1-3a25da713c65"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.337952 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a24e03-f3be-433f-bbc1-3a25da713c65" (UID: "15a24e03-f3be-433f-bbc1-3a25da713c65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.360310 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-config-data" (OuterVolumeSpecName: "config-data") pod "15a24e03-f3be-433f-bbc1-3a25da713c65" (UID: "15a24e03-f3be-433f-bbc1-3a25da713c65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.376873 4834 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.377228 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15a24e03-f3be-433f-bbc1-3a25da713c65-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.377237 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.377245 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-config-data\") on node \"crc\" DevicePath 
\"\"" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.377253 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kckgb\" (UniqueName: \"kubernetes.io/projected/15a24e03-f3be-433f-bbc1-3a25da713c65-kube-api-access-kckgb\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.377262 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a24e03-f3be-433f-bbc1-3a25da713c65-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.737677 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-449pd" event={"ID":"15a24e03-f3be-433f-bbc1-3a25da713c65","Type":"ContainerDied","Data":"07ae74e4e2b31e588e805946eb9d63d78dd157efad9f6c62d56e966b6884ca67"} Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.737711 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07ae74e4e2b31e588e805946eb9d63d78dd157efad9f6c62d56e966b6884ca67" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.737767 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-449pd" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.816940 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.928599 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:44:24 crc kubenswrapper[4834]: E1008 22:44:24.928976 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" containerName="cinder-db-sync" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.928990 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" containerName="cinder-db-sync" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.929195 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" containerName="cinder-db-sync" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.930069 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.936428 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.936722 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.936833 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rls8w" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.937009 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.951418 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.989030 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvppl\" (UniqueName: \"kubernetes.io/projected/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-kube-api-access-kvppl\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.989326 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.989442 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.989522 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.989620 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-scripts\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:24 crc kubenswrapper[4834]: I1008 22:44:24.989709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.009557 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cddb74997-k5mcv"] Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.009943 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" containerName="dnsmasq-dns" containerID="cri-o://c0088ca879ba8d1ddb5b1b6afef95732182c674734219c0d0ffd9c984bdc02bb" gracePeriod=10 Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.033324 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 
22:44:25.038423 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59b9656b65-lm9hg"] Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.039964 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.059495 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b9656b65-lm9hg"] Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.096942 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-sb\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.097935 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvppl\" (UniqueName: \"kubernetes.io/projected/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-kube-api-access-kvppl\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.098340 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-swift-storage-0\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101188 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtgdx\" (UniqueName: \"kubernetes.io/projected/b67f945d-dab3-4ece-9627-d5891da263ae-kube-api-access-gtgdx\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" 
(UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101325 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101416 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-config\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101501 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-nb\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101605 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101709 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" 
Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101782 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-scripts\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.101866 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.102022 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-svc\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.103728 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.137514 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvppl\" (UniqueName: \"kubernetes.io/projected/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-kube-api-access-kvppl\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.138057 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-scripts\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.138855 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.143681 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.148423 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.157196 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.158673 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.162831 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.194284 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205199 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0cf0722-5c65-4442-9537-9e3d5f5eb262-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205265 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nt84\" (UniqueName: \"kubernetes.io/projected/a0cf0722-5c65-4442-9537-9e3d5f5eb262-kube-api-access-9nt84\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205308 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-svc\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205335 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205351 4834 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-scripts\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205398 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-sb\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205413 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205433 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cf0722-5c65-4442-9537-9e3d5f5eb262-logs\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205477 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-swift-storage-0\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205517 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtgdx\" (UniqueName: \"kubernetes.io/projected/b67f945d-dab3-4ece-9627-d5891da263ae-kube-api-access-gtgdx\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-config\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.205564 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-nb\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.206769 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-svc\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.207337 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-sb\") 
pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.208028 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-swift-storage-0\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.208540 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-config\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.223335 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-nb\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.245651 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtgdx\" (UniqueName: \"kubernetes.io/projected/b67f945d-dab3-4ece-9627-d5891da263ae-kube-api-access-gtgdx\") pod \"dnsmasq-dns-59b9656b65-lm9hg\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.249019 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.309929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nt84\" (UniqueName: \"kubernetes.io/projected/a0cf0722-5c65-4442-9537-9e3d5f5eb262-kube-api-access-9nt84\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310020 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310039 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310068 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-scripts\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310090 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310114 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a0cf0722-5c65-4442-9537-9e3d5f5eb262-logs\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0cf0722-5c65-4442-9537-9e3d5f5eb262-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.310339 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0cf0722-5c65-4442-9537-9e3d5f5eb262-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.311458 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cf0722-5c65-4442-9537-9e3d5f5eb262-logs\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.318412 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data-custom\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.322036 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-scripts\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.326746 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.327426 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.332322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nt84\" (UniqueName: \"kubernetes.io/projected/a0cf0722-5c65-4442-9537-9e3d5f5eb262-kube-api-access-9nt84\") pod \"cinder-api-0\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") " pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.380011 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.409560 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.506745 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.512928 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.545461 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.546607 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxltg\" (UniqueName: \"kubernetes.io/projected/6082b9c6-252a-49ee-bcd5-7d58bd99ff23-kube-api-access-rxltg\") pod \"6082b9c6-252a-49ee-bcd5-7d58bd99ff23\" (UID: \"6082b9c6-252a-49ee-bcd5-7d58bd99ff23\") " Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.546797 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mvh\" (UniqueName: \"kubernetes.io/projected/0ccc4594-4823-4520-af7d-213d6dac2490-kube-api-access-x5mvh\") pod \"0ccc4594-4823-4520-af7d-213d6dac2490\" (UID: \"0ccc4594-4823-4520-af7d-213d6dac2490\") " Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.563410 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6082b9c6-252a-49ee-bcd5-7d58bd99ff23-kube-api-access-rxltg" (OuterVolumeSpecName: "kube-api-access-rxltg") pod "6082b9c6-252a-49ee-bcd5-7d58bd99ff23" (UID: "6082b9c6-252a-49ee-bcd5-7d58bd99ff23"). InnerVolumeSpecName "kube-api-access-rxltg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.563567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ccc4594-4823-4520-af7d-213d6dac2490-kube-api-access-x5mvh" (OuterVolumeSpecName: "kube-api-access-x5mvh") pod "0ccc4594-4823-4520-af7d-213d6dac2490" (UID: "0ccc4594-4823-4520-af7d-213d6dac2490"). InnerVolumeSpecName "kube-api-access-x5mvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.618068 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6" path="/var/lib/kubelet/pods/8c5a9ba5-1282-44ad-9d4d-4e0d92d417d6/volumes" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.648729 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x2lc\" (UniqueName: \"kubernetes.io/projected/01c52817-5a8e-480e-847c-eceaba519de6-kube-api-access-4x2lc\") pod \"01c52817-5a8e-480e-847c-eceaba519de6\" (UID: \"01c52817-5a8e-480e-847c-eceaba519de6\") " Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.649198 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mvh\" (UniqueName: \"kubernetes.io/projected/0ccc4594-4823-4520-af7d-213d6dac2490-kube-api-access-x5mvh\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.649211 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxltg\" (UniqueName: \"kubernetes.io/projected/6082b9c6-252a-49ee-bcd5-7d58bd99ff23-kube-api-access-rxltg\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.658243 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c52817-5a8e-480e-847c-eceaba519de6-kube-api-access-4x2lc" (OuterVolumeSpecName: "kube-api-access-4x2lc") pod "01c52817-5a8e-480e-847c-eceaba519de6" (UID: "01c52817-5a8e-480e-847c-eceaba519de6"). InnerVolumeSpecName "kube-api-access-4x2lc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.752341 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x2lc\" (UniqueName: \"kubernetes.io/projected/01c52817-5a8e-480e-847c-eceaba519de6-kube-api-access-4x2lc\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.781135 4834 generic.go:334] "Generic (PLEG): container finished" podID="2e4a70f3-08bc-4136-a541-5027038cf824" containerID="c0088ca879ba8d1ddb5b1b6afef95732182c674734219c0d0ffd9c984bdc02bb" exitCode=0 Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.781356 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" event={"ID":"2e4a70f3-08bc-4136-a541-5027038cf824","Type":"ContainerDied","Data":"c0088ca879ba8d1ddb5b1b6afef95732182c674734219c0d0ffd9c984bdc02bb"} Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.783590 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerStarted","Data":"bc956c570db48b8cef2f7cedecc9281d4ea6cbb3af15e6de70c00d43a32d59ce"} Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.790791 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2ad8-account-create-x9wt9" event={"ID":"6082b9c6-252a-49ee-bcd5-7d58bd99ff23","Type":"ContainerDied","Data":"ffcc5489d8685095732259011cb9f755df157abb44b5e9f4b90f8d697ca723b2"} Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.790910 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffcc5489d8685095732259011cb9f755df157abb44b5e9f4b90f8d697ca723b2" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.791070 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-2ad8-account-create-x9wt9" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.795449 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c373-account-create-ld9q7" event={"ID":"01c52817-5a8e-480e-847c-eceaba519de6","Type":"ContainerDied","Data":"ddbc09e48e7dc61663515e9d55637d423405f582afa3d303e89f001ba59b9a0c"} Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.795557 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddbc09e48e7dc61663515e9d55637d423405f582afa3d303e89f001ba59b9a0c" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.795688 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c373-account-create-ld9q7" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.823688 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-f0f4-account-create-hgs5p" event={"ID":"0ccc4594-4823-4520-af7d-213d6dac2490","Type":"ContainerDied","Data":"8ffa38a96c57a24cc3dee40ddc3755911b6a6c7ef442b71817a6d7b6c5da798a"} Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.823725 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ffa38a96c57a24cc3dee40ddc3755911b6a6c7ef442b71817a6d7b6c5da798a" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.823787 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-f0f4-account-create-hgs5p" Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.956524 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:44:25 crc kubenswrapper[4834]: I1008 22:44:25.970815 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.076080 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.148806 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.160945 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-nb\") pod \"2e4a70f3-08bc-4136-a541-5027038cf824\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.161019 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-svc\") pod \"2e4a70f3-08bc-4136-a541-5027038cf824\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.161110 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc5q9\" (UniqueName: \"kubernetes.io/projected/2e4a70f3-08bc-4136-a541-5027038cf824-kube-api-access-lc5q9\") pod \"2e4a70f3-08bc-4136-a541-5027038cf824\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.161137 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-sb\") pod \"2e4a70f3-08bc-4136-a541-5027038cf824\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.161207 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-config\") pod \"2e4a70f3-08bc-4136-a541-5027038cf824\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.161241 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-swift-storage-0\") pod \"2e4a70f3-08bc-4136-a541-5027038cf824\" (UID: \"2e4a70f3-08bc-4136-a541-5027038cf824\") " Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.170072 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4a70f3-08bc-4136-a541-5027038cf824-kube-api-access-lc5q9" (OuterVolumeSpecName: "kube-api-access-lc5q9") pod "2e4a70f3-08bc-4136-a541-5027038cf824" (UID: "2e4a70f3-08bc-4136-a541-5027038cf824"). InnerVolumeSpecName "kube-api-access-lc5q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.263528 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b9656b65-lm9hg"] Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.264637 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc5q9\" (UniqueName: \"kubernetes.io/projected/2e4a70f3-08bc-4136-a541-5027038cf824-kube-api-access-lc5q9\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.327960 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e4a70f3-08bc-4136-a541-5027038cf824" (UID: "2e4a70f3-08bc-4136-a541-5027038cf824"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.332533 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e4a70f3-08bc-4136-a541-5027038cf824" (UID: "2e4a70f3-08bc-4136-a541-5027038cf824"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.353256 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e4a70f3-08bc-4136-a541-5027038cf824" (UID: "2e4a70f3-08bc-4136-a541-5027038cf824"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.361264 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e4a70f3-08bc-4136-a541-5027038cf824" (UID: "2e4a70f3-08bc-4136-a541-5027038cf824"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.361403 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-config" (OuterVolumeSpecName: "config") pod "2e4a70f3-08bc-4136-a541-5027038cf824" (UID: "2e4a70f3-08bc-4136-a541-5027038cf824"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.368246 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.368277 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.368287 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.368296 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.368303 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e4a70f3-08bc-4136-a541-5027038cf824-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.417830 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.419565 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-log" containerID="cri-o://8e857802b4273bb3fc6019e44989d03f7abc803eb79bcd62026051a8aaa24e78" gracePeriod=30 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.419896 4834 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-httpd" containerID="cri-o://417e9a1a9853a87e4154fd1f997b54fa716f0e33a21ddc62dd44f49568347aaf" gracePeriod=30 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.539567 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.617905 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-787c99cdfb-4zdjc"] Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.618128 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-787c99cdfb-4zdjc" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api-log" containerID="cri-o://7a7b33fee922cee7f61655aff3ab1dedaeaacce62bdf135a48bf5992989ac4bc" gracePeriod=30 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.618501 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-787c99cdfb-4zdjc" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api" containerID="cri-o://7ecc30acf1a932b56177f5900a0493e3740b2d798a4342e2a322d6c870308405" gracePeriod=30 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.847469 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c","Type":"ContainerStarted","Data":"a505dd8a292b30869f8be60bf8c684cecb1cde6fd3d8ac051f7235125f810b2a"} Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.849268 4834 generic.go:334] "Generic (PLEG): container finished" podID="cef930bb-c211-441f-a59f-e797704ce837" containerID="8e857802b4273bb3fc6019e44989d03f7abc803eb79bcd62026051a8aaa24e78" exitCode=143 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.849308 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef930bb-c211-441f-a59f-e797704ce837","Type":"ContainerDied","Data":"8e857802b4273bb3fc6019e44989d03f7abc803eb79bcd62026051a8aaa24e78"} Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.874494 4834 generic.go:334] "Generic (PLEG): container finished" podID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerID="7a7b33fee922cee7f61655aff3ab1dedaeaacce62bdf135a48bf5992989ac4bc" exitCode=143 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.874589 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787c99cdfb-4zdjc" event={"ID":"3553ff9f-9a8d-40bc-919a-f6a400f001f6","Type":"ContainerDied","Data":"7a7b33fee922cee7f61655aff3ab1dedaeaacce62bdf135a48bf5992989ac4bc"} Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.876059 4834 generic.go:334] "Generic (PLEG): container finished" podID="b67f945d-dab3-4ece-9627-d5891da263ae" containerID="10dec061dba544e1b7211a31d01da7089008ca2b300d4f403582aec78220e5f6" exitCode=0 Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.876103 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" event={"ID":"b67f945d-dab3-4ece-9627-d5891da263ae","Type":"ContainerDied","Data":"10dec061dba544e1b7211a31d01da7089008ca2b300d4f403582aec78220e5f6"} Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.876121 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" event={"ID":"b67f945d-dab3-4ece-9627-d5891da263ae","Type":"ContainerStarted","Data":"4bf18755698eadac6a0b6a8c4d8cf1320b962288741f68254b85a65d7b63f371"} Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.893442 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0cf0722-5c65-4442-9537-9e3d5f5eb262","Type":"ContainerStarted","Data":"64bb81f51d6d028512e6af6b3fa2d7631efebc3c7166dd733a6a7333ea864ee9"} Oct 08 22:44:26 crc 
kubenswrapper[4834]: I1008 22:44:26.909593 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" event={"ID":"2e4a70f3-08bc-4136-a541-5027038cf824","Type":"ContainerDied","Data":"00c713db00bbc8abd08b68a986b96fa9e8fc10375ce25acc1ae457e8ffea6479"} Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.909644 4834 scope.go:117] "RemoveContainer" containerID="c0088ca879ba8d1ddb5b1b6afef95732182c674734219c0d0ffd9c984bdc02bb" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.909759 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cddb74997-k5mcv" Oct 08 22:44:26 crc kubenswrapper[4834]: I1008 22:44:26.940310 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerStarted","Data":"cc045e43247a37d4a44ed55adfaf9586f56217a8ef6fd2d786ddc2a902bb8299"} Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.067513 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.107113 4834 scope.go:117] "RemoveContainer" containerID="c1f65059306adb156fafa469244daa446a690f724849d1ee8757c325fabaefa7" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.138357 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cddb74997-k5mcv"] Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.228808 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cddb74997-k5mcv"] Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.296638 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zp2gc"] Oct 08 22:44:27 crc kubenswrapper[4834]: E1008 22:44:27.297534 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c52817-5a8e-480e-847c-eceaba519de6" containerName="mariadb-account-create" Oct 08 
22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297547 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c52817-5a8e-480e-847c-eceaba519de6" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: E1008 22:44:27.297563 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6082b9c6-252a-49ee-bcd5-7d58bd99ff23" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297569 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6082b9c6-252a-49ee-bcd5-7d58bd99ff23" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: E1008 22:44:27.297583 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" containerName="dnsmasq-dns" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297589 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" containerName="dnsmasq-dns" Oct 08 22:44:27 crc kubenswrapper[4834]: E1008 22:44:27.297614 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ccc4594-4823-4520-af7d-213d6dac2490" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297621 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ccc4594-4823-4520-af7d-213d6dac2490" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: E1008 22:44:27.297629 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" containerName="init" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297636 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" containerName="init" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297828 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c52817-5a8e-480e-847c-eceaba519de6" containerName="mariadb-account-create" Oct 08 
22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297899 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ccc4594-4823-4520-af7d-213d6dac2490" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297915 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" containerName="dnsmasq-dns" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.297924 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6082b9c6-252a-49ee-bcd5-7d58bd99ff23" containerName="mariadb-account-create" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.299812 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.301636 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mrcjf" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.301851 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.302031 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.309752 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zp2gc"] Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.400092 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7d5c\" (UniqueName: \"kubernetes.io/projected/30f60664-0524-49bc-8e17-19305b2ae60a-kube-api-access-s7d5c\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.400156 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-config-data\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.400229 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.400275 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-scripts\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.501972 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7d5c\" (UniqueName: \"kubernetes.io/projected/30f60664-0524-49bc-8e17-19305b2ae60a-kube-api-access-s7d5c\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.502036 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-config-data\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc 
kubenswrapper[4834]: I1008 22:44:27.502071 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.502122 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-scripts\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.506661 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-scripts\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.507539 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-config-data\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.507970 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.526517 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7d5c\" (UniqueName: \"kubernetes.io/projected/30f60664-0524-49bc-8e17-19305b2ae60a-kube-api-access-s7d5c\") pod \"nova-cell0-conductor-db-sync-zp2gc\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.571204 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e4a70f3-08bc-4136-a541-5027038cf824" path="/var/lib/kubelet/pods/2e4a70f3-08bc-4136-a541-5027038cf824/volumes" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.650989 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:27 crc kubenswrapper[4834]: I1008 22:44:27.999225 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0cf0722-5c65-4442-9537-9e3d5f5eb262","Type":"ContainerStarted","Data":"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f"} Oct 08 22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.009231 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerStarted","Data":"cdbfd918937b6685ea1d96b1cec28360fa1081ade005c89ae879d54bc06f6228"} Oct 08 22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.009268 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerStarted","Data":"aa03ee65e2ccb19d2281ae90f69f5e354c069d95c67dae21cec48157be39120a"} Oct 08 22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.010941 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" event={"ID":"b67f945d-dab3-4ece-9627-d5891da263ae","Type":"ContainerStarted","Data":"3599314830d6c060085be33d2da4be7769553f44c3fa06c55dee8c8207a178bd"} Oct 08 
22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.011399 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.042673 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" podStartSLOduration=3.042657584 podStartE2EDuration="3.042657584s" podCreationTimestamp="2025-10-08 22:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:28.040746607 +0000 UTC m=+1275.863631353" watchObservedRunningTime="2025-10-08 22:44:28.042657584 +0000 UTC m=+1275.865542330" Oct 08 22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.272418 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zp2gc"] Oct 08 22:44:28 crc kubenswrapper[4834]: I1008 22:44:28.908705 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.021028 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" event={"ID":"30f60664-0524-49bc-8e17-19305b2ae60a","Type":"ContainerStarted","Data":"af0f3b9a7d893187989ef544705b5f3a340f4f680bd904de95429d1b99a8b8f7"} Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.025539 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c","Type":"ContainerStarted","Data":"334dbc351804b123ccb2ad2c99ced8191afb6411583d0b48bc1eb6e511322b49"} Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.025645 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c","Type":"ContainerStarted","Data":"2ab134505a31e4a3aecfbd84147c1a11da7f89069d5db0e00164907b6204e8df"} Oct 08 
22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.029032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0cf0722-5c65-4442-9537-9e3d5f5eb262","Type":"ContainerStarted","Data":"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a"} Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.029324 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.046800 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.958276088 podStartE2EDuration="5.04678543s" podCreationTimestamp="2025-10-08 22:44:24 +0000 UTC" firstStartedPulling="2025-10-08 22:44:25.982751259 +0000 UTC m=+1273.805636005" lastFinishedPulling="2025-10-08 22:44:27.071260601 +0000 UTC m=+1274.894145347" observedRunningTime="2025-10-08 22:44:29.043790518 +0000 UTC m=+1276.866675264" watchObservedRunningTime="2025-10-08 22:44:29.04678543 +0000 UTC m=+1276.869670176" Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.064741 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.064726636 podStartE2EDuration="4.064726636s" podCreationTimestamp="2025-10-08 22:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:29.063469895 +0000 UTC m=+1276.886354641" watchObservedRunningTime="2025-10-08 22:44:29.064726636 +0000 UTC m=+1276.887611382" Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.751514 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.752082 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-log" containerID="cri-o://dc5e2e8417ace5c999280b76594b36eb8fbcb6af9691cc8cce6e64f0a2c22444" gracePeriod=30 Oct 08 22:44:29 crc kubenswrapper[4834]: I1008 22:44:29.752402 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-httpd" containerID="cri-o://3d96780e3f5bf5401fcf705807bcc64c49484592f8ee3396cf64804cc4dd5a27" gracePeriod=30 Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.068680 4834 generic.go:334] "Generic (PLEG): container finished" podID="cef930bb-c211-441f-a59f-e797704ce837" containerID="417e9a1a9853a87e4154fd1f997b54fa716f0e33a21ddc62dd44f49568347aaf" exitCode=0 Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.068912 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef930bb-c211-441f-a59f-e797704ce837","Type":"ContainerDied","Data":"417e9a1a9853a87e4154fd1f997b54fa716f0e33a21ddc62dd44f49568347aaf"} Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.105390 4834 generic.go:334] "Generic (PLEG): container finished" podID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerID="7ecc30acf1a932b56177f5900a0493e3740b2d798a4342e2a322d6c870308405" exitCode=0 Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.105471 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787c99cdfb-4zdjc" event={"ID":"3553ff9f-9a8d-40bc-919a-f6a400f001f6","Type":"ContainerDied","Data":"7ecc30acf1a932b56177f5900a0493e3740b2d798a4342e2a322d6c870308405"} Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.109117 4834 generic.go:334] "Generic (PLEG): container finished" podID="b34caaaa-9ad3-42b8-8537-876601474580" containerID="dc5e2e8417ace5c999280b76594b36eb8fbcb6af9691cc8cce6e64f0a2c22444" exitCode=143 Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.109369 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b34caaaa-9ad3-42b8-8537-876601474580","Type":"ContainerDied","Data":"dc5e2e8417ace5c999280b76594b36eb8fbcb6af9691cc8cce6e64f0a2c22444"}
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.109465 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api-log" containerID="cri-o://4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f" gracePeriod=30
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.109571 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api" containerID="cri-o://1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a" gracePeriod=30
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.185595 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.250611 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.261781 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-scripts\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.261832 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-public-tls-certs\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.261871 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq7s9\" (UniqueName: \"kubernetes.io/projected/cef930bb-c211-441f-a59f-e797704ce837-kube-api-access-bq7s9\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.261924 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-config-data\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.261975 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-combined-ca-bundle\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.262029 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-logs\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.262073 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-httpd-run\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.262120 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cef930bb-c211-441f-a59f-e797704ce837\" (UID: \"cef930bb-c211-441f-a59f-e797704ce837\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.269814 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.275335 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-scripts" (OuterVolumeSpecName: "scripts") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.281215 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.282197 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-logs" (OuterVolumeSpecName: "logs") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.304259 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef930bb-c211-441f-a59f-e797704ce837-kube-api-access-bq7s9" (OuterVolumeSpecName: "kube-api-access-bq7s9") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "kube-api-access-bq7s9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.345684 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.353086 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-config-data" (OuterVolumeSpecName: "config-data") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364194 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364264 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq7s9\" (UniqueName: \"kubernetes.io/projected/cef930bb-c211-441f-a59f-e797704ce837-kube-api-access-bq7s9\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364301 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364312 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364322 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-logs\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364330 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cef930bb-c211-441f-a59f-e797704ce837-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.364350 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.387440 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787c99cdfb-4zdjc"
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.406633 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.420356 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cef930bb-c211-441f-a59f-e797704ce837" (UID: "cef930bb-c211-441f-a59f-e797704ce837"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.465836 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-combined-ca-bundle\") pod \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.466300 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9bnp\" (UniqueName: \"kubernetes.io/projected/3553ff9f-9a8d-40bc-919a-f6a400f001f6-kube-api-access-c9bnp\") pod \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.466366 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3553ff9f-9a8d-40bc-919a-f6a400f001f6-logs\") pod \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.466445 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data\") pod \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.466537 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data-custom\") pod \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\" (UID: \"3553ff9f-9a8d-40bc-919a-f6a400f001f6\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.466951 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.466966 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cef930bb-c211-441f-a59f-e797704ce837-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.467277 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3553ff9f-9a8d-40bc-919a-f6a400f001f6-logs" (OuterVolumeSpecName: "logs") pod "3553ff9f-9a8d-40bc-919a-f6a400f001f6" (UID: "3553ff9f-9a8d-40bc-919a-f6a400f001f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.475454 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3553ff9f-9a8d-40bc-919a-f6a400f001f6" (UID: "3553ff9f-9a8d-40bc-919a-f6a400f001f6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.475504 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3553ff9f-9a8d-40bc-919a-f6a400f001f6-kube-api-access-c9bnp" (OuterVolumeSpecName: "kube-api-access-c9bnp") pod "3553ff9f-9a8d-40bc-919a-f6a400f001f6" (UID: "3553ff9f-9a8d-40bc-919a-f6a400f001f6"). InnerVolumeSpecName "kube-api-access-c9bnp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.502990 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3553ff9f-9a8d-40bc-919a-f6a400f001f6" (UID: "3553ff9f-9a8d-40bc-919a-f6a400f001f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.548246 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data" (OuterVolumeSpecName: "config-data") pod "3553ff9f-9a8d-40bc-919a-f6a400f001f6" (UID: "3553ff9f-9a8d-40bc-919a-f6a400f001f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.568261 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.568290 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.568300 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9bnp\" (UniqueName: \"kubernetes.io/projected/3553ff9f-9a8d-40bc-919a-f6a400f001f6-kube-api-access-c9bnp\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.568311 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3553ff9f-9a8d-40bc-919a-f6a400f001f6-logs\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.568320 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3553ff9f-9a8d-40bc-919a-f6a400f001f6-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.804778 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.873747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.873855 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-scripts\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.873874 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0cf0722-5c65-4442-9537-9e3d5f5eb262-etc-machine-id\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.873919 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nt84\" (UniqueName: \"kubernetes.io/projected/a0cf0722-5c65-4442-9537-9e3d5f5eb262-kube-api-access-9nt84\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.874065 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-combined-ca-bundle\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.874123 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cf0722-5c65-4442-9537-9e3d5f5eb262-logs\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.874167 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data-custom\") pod \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\" (UID: \"a0cf0722-5c65-4442-9537-9e3d5f5eb262\") "
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.874133 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0cf0722-5c65-4442-9537-9e3d5f5eb262-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.874846 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0cf0722-5c65-4442-9537-9e3d5f5eb262-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.875928 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0cf0722-5c65-4442-9537-9e3d5f5eb262-logs" (OuterVolumeSpecName: "logs") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.880199 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.883923 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-scripts" (OuterVolumeSpecName: "scripts") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.900495 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0cf0722-5c65-4442-9537-9e3d5f5eb262-kube-api-access-9nt84" (OuterVolumeSpecName: "kube-api-access-9nt84") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "kube-api-access-9nt84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.959321 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data" (OuterVolumeSpecName: "config-data") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.977292 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.977323 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nt84\" (UniqueName: \"kubernetes.io/projected/a0cf0722-5c65-4442-9537-9e3d5f5eb262-kube-api-access-9nt84\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.977334 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cf0722-5c65-4442-9537-9e3d5f5eb262-logs\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.977344 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.977353 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:30 crc kubenswrapper[4834]: I1008 22:44:30.978017 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0cf0722-5c65-4442-9537-9e3d5f5eb262" (UID: "a0cf0722-5c65-4442-9537-9e3d5f5eb262"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.097110 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cf0722-5c65-4442-9537-9e3d5f5eb262-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.128661 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cef930bb-c211-441f-a59f-e797704ce837","Type":"ContainerDied","Data":"d6babd5298cb2fe00e5a82ee5906f571f8aa371eb5dfe4382284a2497d7d960c"}
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.129043 4834 scope.go:117] "RemoveContainer" containerID="417e9a1a9853a87e4154fd1f997b54fa716f0e33a21ddc62dd44f49568347aaf"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.128702 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.133493 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-787c99cdfb-4zdjc" event={"ID":"3553ff9f-9a8d-40bc-919a-f6a400f001f6","Type":"ContainerDied","Data":"132e0fabcc40740b5e055067a975da306a297f22445cb9111d2434dbedd54ece"}
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.133505 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-787c99cdfb-4zdjc"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.141592 4834 generic.go:334] "Generic (PLEG): container finished" podID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerID="1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a" exitCode=0
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.141629 4834 generic.go:334] "Generic (PLEG): container finished" podID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerID="4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f" exitCode=143
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.141687 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0cf0722-5c65-4442-9537-9e3d5f5eb262","Type":"ContainerDied","Data":"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a"}
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.141714 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0cf0722-5c65-4442-9537-9e3d5f5eb262","Type":"ContainerDied","Data":"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f"}
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.141725 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a0cf0722-5c65-4442-9537-9e3d5f5eb262","Type":"ContainerDied","Data":"64bb81f51d6d028512e6af6b3fa2d7631efebc3c7166dd733a6a7333ea864ee9"}
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.141788 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.151312 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-central-agent" containerID="cri-o://cc045e43247a37d4a44ed55adfaf9586f56217a8ef6fd2d786ddc2a902bb8299" gracePeriod=30
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.151823 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="proxy-httpd" containerID="cri-o://d083678f5ed26a71eacdcaadb0d7d43293b629fd6d360371306fb6b2c57fb7d9" gracePeriod=30
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.151877 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="sg-core" containerID="cri-o://cdbfd918937b6685ea1d96b1cec28360fa1081ade005c89ae879d54bc06f6228" gracePeriod=30
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.151911 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-notification-agent" containerID="cri-o://aa03ee65e2ccb19d2281ae90f69f5e354c069d95c67dae21cec48157be39120a" gracePeriod=30
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.151974 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerStarted","Data":"d083678f5ed26a71eacdcaadb0d7d43293b629fd6d360371306fb6b2c57fb7d9"}
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.152003 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.180311 4834 scope.go:117] "RemoveContainer" containerID="8e857802b4273bb3fc6019e44989d03f7abc803eb79bcd62026051a8aaa24e78"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.184725 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.264492503 podStartE2EDuration="8.184708548s" podCreationTimestamp="2025-10-08 22:44:23 +0000 UTC" firstStartedPulling="2025-10-08 22:44:24.815550957 +0000 UTC m=+1272.638435703" lastFinishedPulling="2025-10-08 22:44:29.735767002 +0000 UTC m=+1277.558651748" observedRunningTime="2025-10-08 22:44:31.172044831 +0000 UTC m=+1278.994929577" watchObservedRunningTime="2025-10-08 22:44:31.184708548 +0000 UTC m=+1279.007593284"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.207275 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.229093 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.241187 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-787c99cdfb-4zdjc"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.242850 4834 scope.go:117] "RemoveContainer" containerID="7ecc30acf1a932b56177f5900a0493e3740b2d798a4342e2a322d6c870308405"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.249279 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-787c99cdfb-4zdjc"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.263958 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.264351 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264362 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.264376 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264383 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api"
Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.264399 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264405 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.264414 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-httpd"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264420 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-httpd"
Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.264434 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264439 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api"
Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.264464 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264471 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264633 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264651 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264663 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" containerName="barbican-api"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264670 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-httpd"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264679 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cef930bb-c211-441f-a59f-e797704ce837" containerName="glance-log"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.264688 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" containerName="cinder-api"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.265612 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.268806 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.268928 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.268956 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.273896 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.282615 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.295938 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.312710 4834 scope.go:117] "RemoveContainer" containerID="7a7b33fee922cee7f61655aff3ab1dedaeaacce62bdf135a48bf5992989ac4bc"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.315744 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.317508 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.319250 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.320414 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.337434 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.380307 4834 scope.go:117] "RemoveContainer" containerID="1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405087 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405340 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-logs\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405380 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405489 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405685 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fede876-b44b-40e1-8c56-9c35d2528e37-logs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405721 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.405779 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406117 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjmgb\" (UniqueName: \"kubernetes.io/projected/f1c297e1-ec55-4113-a87d-7813a27c03d9-kube-api-access-xjmgb\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406209 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxlmm\" (UniqueName: \"kubernetes.io/projected/2fede876-b44b-40e1-8c56-9c35d2528e37-kube-api-access-fxlmm\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406265 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fede876-b44b-40e1-8c56-9c35d2528e37-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406328 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-scripts\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0"
Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406510 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\"
(UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406551 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.406684 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.410038 4834 scope.go:117] "RemoveContainer" containerID="4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.438577 4834 scope.go:117] "RemoveContainer" containerID="1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a" Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.439802 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a\": container 
with ID starting with 1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a not found: ID does not exist" containerID="1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.439868 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a"} err="failed to get container status \"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a\": rpc error: code = NotFound desc = could not find container \"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a\": container with ID starting with 1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a not found: ID does not exist" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.439893 4834 scope.go:117] "RemoveContainer" containerID="4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f" Oct 08 22:44:31 crc kubenswrapper[4834]: E1008 22:44:31.440492 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f\": container with ID starting with 4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f not found: ID does not exist" containerID="4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.440542 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f"} err="failed to get container status \"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f\": rpc error: code = NotFound desc = could not find container \"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f\": container with ID starting with 4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f not 
found: ID does not exist" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.440604 4834 scope.go:117] "RemoveContainer" containerID="1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.441028 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a"} err="failed to get container status \"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a\": rpc error: code = NotFound desc = could not find container \"1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a\": container with ID starting with 1d749aa7de9c1612b8c065e69f8c9db331f455daaa5cd0ddaad32c65a2b2b25a not found: ID does not exist" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.441056 4834 scope.go:117] "RemoveContainer" containerID="4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.441331 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f"} err="failed to get container status \"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f\": rpc error: code = NotFound desc = could not find container \"4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f\": container with ID starting with 4c74fee1b87ce2ed89631241f8eea868778c5a2b00e75adc8040b8e015988c2f not found: ID does not exist" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.507929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fede876-b44b-40e1-8c56-9c35d2528e37-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.507975 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.507997 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-scripts\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508015 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fede876-b44b-40e1-8c56-9c35d2528e37-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508032 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508106 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508216 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508233 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508374 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-logs\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508430 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508728 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.508854 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-logs\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509022 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509153 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fede876-b44b-40e1-8c56-9c35d2528e37-logs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509204 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509224 4834 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509248 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509318 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjmgb\" (UniqueName: \"kubernetes.io/projected/f1c297e1-ec55-4113-a87d-7813a27c03d9-kube-api-access-xjmgb\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.509363 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxlmm\" (UniqueName: \"kubernetes.io/projected/2fede876-b44b-40e1-8c56-9c35d2528e37-kube-api-access-fxlmm\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.510184 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fede876-b44b-40e1-8c56-9c35d2528e37-logs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.515602 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-scripts\") pod \"cinder-api-0\" (UID: 
\"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.516574 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.517883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.518004 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-scripts\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.518841 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-config-data\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.518977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.521673 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.526872 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxlmm\" (UniqueName: \"kubernetes.io/projected/2fede876-b44b-40e1-8c56-9c35d2528e37-kube-api-access-fxlmm\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.529094 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.529768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjmgb\" (UniqueName: \"kubernetes.io/projected/f1c297e1-ec55-4113-a87d-7813a27c03d9-kube-api-access-xjmgb\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.536590 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.536752 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.603611 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " pod="openstack/glance-default-external-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.606708 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3553ff9f-9a8d-40bc-919a-f6a400f001f6" path="/var/lib/kubelet/pods/3553ff9f-9a8d-40bc-919a-f6a400f001f6/volumes" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.607660 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0cf0722-5c65-4442-9537-9e3d5f5eb262" path="/var/lib/kubelet/pods/a0cf0722-5c65-4442-9537-9e3d5f5eb262/volumes" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.608957 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cef930bb-c211-441f-a59f-e797704ce837" path="/var/lib/kubelet/pods/cef930bb-c211-441f-a59f-e797704ce837/volumes" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.609288 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:44:31 crc kubenswrapper[4834]: I1008 22:44:31.647189 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.187991 4834 generic.go:334] "Generic (PLEG): container finished" podID="05900eae-0749-428d-9c98-65b20ccaef25" containerID="d083678f5ed26a71eacdcaadb0d7d43293b629fd6d360371306fb6b2c57fb7d9" exitCode=0 Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.188748 4834 generic.go:334] "Generic (PLEG): container finished" podID="05900eae-0749-428d-9c98-65b20ccaef25" containerID="cdbfd918937b6685ea1d96b1cec28360fa1081ade005c89ae879d54bc06f6228" exitCode=2 Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.188762 4834 generic.go:334] "Generic (PLEG): container finished" podID="05900eae-0749-428d-9c98-65b20ccaef25" containerID="aa03ee65e2ccb19d2281ae90f69f5e354c069d95c67dae21cec48157be39120a" exitCode=0 Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.188027 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerDied","Data":"d083678f5ed26a71eacdcaadb0d7d43293b629fd6d360371306fb6b2c57fb7d9"} Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.188864 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerDied","Data":"cdbfd918937b6685ea1d96b1cec28360fa1081ade005c89ae879d54bc06f6228"} Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.188901 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerDied","Data":"aa03ee65e2ccb19d2281ae90f69f5e354c069d95c67dae21cec48157be39120a"} Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.216861 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:44:32 crc kubenswrapper[4834]: I1008 22:44:32.358719 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 08 22:44:32 crc kubenswrapper[4834]: W1008 22:44:32.362078 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1c297e1_ec55_4113_a87d_7813a27c03d9.slice/crio-0e92c2f238ed646df6d52d13fbf2e27b633698504120bf398e47a90cb5276806 WatchSource:0}: Error finding container 0e92c2f238ed646df6d52d13fbf2e27b633698504120bf398e47a90cb5276806: Status 404 returned error can't find the container with id 0e92c2f238ed646df6d52d13fbf2e27b633698504120bf398e47a90cb5276806 Oct 08 22:44:33 crc kubenswrapper[4834]: I1008 22:44:33.209370 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fede876-b44b-40e1-8c56-9c35d2528e37","Type":"ContainerStarted","Data":"69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3"} Oct 08 22:44:33 crc kubenswrapper[4834]: I1008 22:44:33.210307 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fede876-b44b-40e1-8c56-9c35d2528e37","Type":"ContainerStarted","Data":"10b85818c43ca0acb9f142c960fa5273d09c7f7893f29c91852fbdf61fa330c8"} Oct 08 22:44:33 crc kubenswrapper[4834]: I1008 22:44:33.213630 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f1c297e1-ec55-4113-a87d-7813a27c03d9","Type":"ContainerStarted","Data":"354d0197ed5738529f4ce14ae4d167d9d0a781f57eca46b2a301712e02875868"} Oct 08 22:44:33 crc kubenswrapper[4834]: I1008 22:44:33.213677 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f1c297e1-ec55-4113-a87d-7813a27c03d9","Type":"ContainerStarted","Data":"0e92c2f238ed646df6d52d13fbf2e27b633698504120bf398e47a90cb5276806"} Oct 08 22:44:34 crc kubenswrapper[4834]: I1008 22:44:34.232448 4834 generic.go:334] "Generic (PLEG): container finished" podID="b34caaaa-9ad3-42b8-8537-876601474580" 
containerID="3d96780e3f5bf5401fcf705807bcc64c49484592f8ee3396cf64804cc4dd5a27" exitCode=0 Oct 08 22:44:34 crc kubenswrapper[4834]: I1008 22:44:34.232527 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b34caaaa-9ad3-42b8-8537-876601474580","Type":"ContainerDied","Data":"3d96780e3f5bf5401fcf705807bcc64c49484592f8ee3396cf64804cc4dd5a27"} Oct 08 22:44:34 crc kubenswrapper[4834]: I1008 22:44:34.235739 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f1c297e1-ec55-4113-a87d-7813a27c03d9","Type":"ContainerStarted","Data":"7af54c4f4905381853f354524b17a405dd2c9d5ab3d098a361ea339c61a15d5d"} Oct 08 22:44:34 crc kubenswrapper[4834]: I1008 22:44:34.238429 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fede876-b44b-40e1-8c56-9c35d2528e37","Type":"ContainerStarted","Data":"577329ac5e86fe36588abe9f509038517d2bba0f083da6b616ddb28e28603822"} Oct 08 22:44:34 crc kubenswrapper[4834]: I1008 22:44:34.238547 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 08 22:44:34 crc kubenswrapper[4834]: I1008 22:44:34.269666 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.269647686 podStartE2EDuration="3.269647686s" podCreationTimestamp="2025-10-08 22:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:34.264718377 +0000 UTC m=+1282.087603143" watchObservedRunningTime="2025-10-08 22:44:34.269647686 +0000 UTC m=+1282.092532422" Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.259040 4834 generic.go:334] "Generic (PLEG): container finished" podID="05900eae-0749-428d-9c98-65b20ccaef25" containerID="cc045e43247a37d4a44ed55adfaf9586f56217a8ef6fd2d786ddc2a902bb8299" 
exitCode=0 Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.259573 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerDied","Data":"cc045e43247a37d4a44ed55adfaf9586f56217a8ef6fd2d786ddc2a902bb8299"} Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.504413 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.515303 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.526898 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.526879132 podStartE2EDuration="4.526879132s" podCreationTimestamp="2025-10-08 22:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:34.29331138 +0000 UTC m=+1282.116196136" watchObservedRunningTime="2025-10-08 22:44:35.526879132 +0000 UTC m=+1283.349763878" Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.552406 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.609420 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77dd5cf987-ns26j"] Oct 08 22:44:35 crc kubenswrapper[4834]: I1008 22:44:35.609711 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="dnsmasq-dns" containerID="cri-o://cc29667f1378c23afae6203d0838cde7a3a89a414594717058bb98319b018457" gracePeriod=10 Oct 08 22:44:36 crc kubenswrapper[4834]: I1008 22:44:36.270576 4834 generic.go:334] "Generic (PLEG): 
container finished" podID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerID="cc29667f1378c23afae6203d0838cde7a3a89a414594717058bb98319b018457" exitCode=0 Oct 08 22:44:36 crc kubenswrapper[4834]: I1008 22:44:36.270657 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" event={"ID":"22c740c5-2fd7-47bf-b34a-e4df82a1c970","Type":"ContainerDied","Data":"cc29667f1378c23afae6203d0838cde7a3a89a414594717058bb98319b018457"} Oct 08 22:44:36 crc kubenswrapper[4834]: I1008 22:44:36.271131 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="cinder-scheduler" containerID="cri-o://2ab134505a31e4a3aecfbd84147c1a11da7f89069d5db0e00164907b6204e8df" gracePeriod=30 Oct 08 22:44:36 crc kubenswrapper[4834]: I1008 22:44:36.271175 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="probe" containerID="cri-o://334dbc351804b123ccb2ad2c99ced8191afb6411583d0b48bc1eb6e511322b49" gracePeriod=30 Oct 08 22:44:37 crc kubenswrapper[4834]: I1008 22:44:37.280404 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerID="334dbc351804b123ccb2ad2c99ced8191afb6411583d0b48bc1eb6e511322b49" exitCode=0 Oct 08 22:44:37 crc kubenswrapper[4834]: I1008 22:44:37.280430 4834 generic.go:334] "Generic (PLEG): container finished" podID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerID="2ab134505a31e4a3aecfbd84147c1a11da7f89069d5db0e00164907b6204e8df" exitCode=0 Oct 08 22:44:37 crc kubenswrapper[4834]: I1008 22:44:37.280473 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c","Type":"ContainerDied","Data":"334dbc351804b123ccb2ad2c99ced8191afb6411583d0b48bc1eb6e511322b49"} Oct 08 22:44:37 crc 
kubenswrapper[4834]: I1008 22:44:37.280500 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c","Type":"ContainerDied","Data":"2ab134505a31e4a3aecfbd84147c1a11da7f89069d5db0e00164907b6204e8df"} Oct 08 22:44:37 crc kubenswrapper[4834]: I1008 22:44:37.872961 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: connect: connection refused" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.603932 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661029 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-httpd-run\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661106 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-logs\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661207 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-combined-ca-bundle\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661266 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ft264\" (UniqueName: \"kubernetes.io/projected/b34caaaa-9ad3-42b8-8537-876601474580-kube-api-access-ft264\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661311 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-internal-tls-certs\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661329 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661358 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-config-data\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661435 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-scripts\") pod \"b34caaaa-9ad3-42b8-8537-876601474580\" (UID: \"b34caaaa-9ad3-42b8-8537-876601474580\") " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661534 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.661875 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.664263 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-logs" (OuterVolumeSpecName: "logs") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.672219 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.672807 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34caaaa-9ad3-42b8-8537-876601474580-kube-api-access-ft264" (OuterVolumeSpecName: "kube-api-access-ft264") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "kube-api-access-ft264". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.683384 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-scripts" (OuterVolumeSpecName: "scripts") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.739780 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.763364 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.763917 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b34caaaa-9ad3-42b8-8537-876601474580-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.763978 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.766324 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft264\" (UniqueName: \"kubernetes.io/projected/b34caaaa-9ad3-42b8-8537-876601474580-kube-api-access-ft264\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.766256 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.766416 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.789783 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.806223 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-config-data" (OuterVolumeSpecName: "config-data") pod "b34caaaa-9ad3-42b8-8537-876601474580" (UID: "b34caaaa-9ad3-42b8-8537-876601474580"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.925549 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.925571 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:38 crc kubenswrapper[4834]: I1008 22:44:38.925579 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b34caaaa-9ad3-42b8-8537-876601474580-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.131777 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.136287 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.145986 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230238 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-combined-ca-bundle\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230427 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-sg-core-conf-yaml\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230453 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvppl\" (UniqueName: \"kubernetes.io/projected/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-kube-api-access-kvppl\") pod \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230509 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-scripts\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230565 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-run-httpd\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230580 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-scripts\") pod \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230688 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data-custom\") pod \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230707 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-config-data\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230735 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-combined-ca-bundle\") pod \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230754 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-log-httpd\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 
22:44:39.230784 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data\") pod \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230825 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwwfk\" (UniqueName: \"kubernetes.io/projected/05900eae-0749-428d-9c98-65b20ccaef25-kube-api-access-xwwfk\") pod \"05900eae-0749-428d-9c98-65b20ccaef25\" (UID: \"05900eae-0749-428d-9c98-65b20ccaef25\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.230849 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-etc-machine-id\") pod \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\" (UID: \"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.231218 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" (UID: "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.232264 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.236315 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.239637 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" (UID: "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.241031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-scripts" (OuterVolumeSpecName: "scripts") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.244106 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-scripts" (OuterVolumeSpecName: "scripts") pod "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" (UID: "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.245549 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-kube-api-access-kvppl" (OuterVolumeSpecName: "kube-api-access-kvppl") pod "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" (UID: "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c"). InnerVolumeSpecName "kube-api-access-kvppl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.259336 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05900eae-0749-428d-9c98-65b20ccaef25-kube-api-access-xwwfk" (OuterVolumeSpecName: "kube-api-access-xwwfk") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "kube-api-access-xwwfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.289037 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" (UID: "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.299656 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" event={"ID":"22c740c5-2fd7-47bf-b34a-e4df82a1c970","Type":"ContainerDied","Data":"b58d03a3fff83c365590c44f6006233a20d8efc6ecb0cc5961d3aa00211e24d4"} Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.299707 4834 scope.go:117] "RemoveContainer" containerID="cc29667f1378c23afae6203d0838cde7a3a89a414594717058bb98319b018457" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.299819 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77dd5cf987-ns26j" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.302366 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ae93bf6b-cba0-4e57-8686-ab3d694cfc3c","Type":"ContainerDied","Data":"a505dd8a292b30869f8be60bf8c684cecb1cde6fd3d8ac051f7235125f810b2a"} Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.302421 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.303841 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b34caaaa-9ad3-42b8-8537-876601474580","Type":"ContainerDied","Data":"653fa47fc8457b754c433323c3245942fc4829eb9dfa7977f529500398d68ad0"} Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.303897 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.316853 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05900eae-0749-428d-9c98-65b20ccaef25","Type":"ContainerDied","Data":"bc956c570db48b8cef2f7cedecc9281d4ea6cbb3af15e6de70c00d43a32d59ce"} Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.317031 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.321993 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.329678 4834 scope.go:117] "RemoveContainer" containerID="33e4af26bbf4130eedb59c9e1f95713af28585725ac37d9b6497b7c23f8f115f" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.334037 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-swift-storage-0\") pod \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.334086 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-config\") pod \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.334203 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-sb\") pod \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.334447 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr7tg\" (UniqueName: \"kubernetes.io/projected/22c740c5-2fd7-47bf-b34a-e4df82a1c970-kube-api-access-tr7tg\") pod \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.334478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-svc\") pod \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.334522 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-nb\") pod \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\" (UID: \"22c740c5-2fd7-47bf-b34a-e4df82a1c970\") " Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335129 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335192 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335202 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335211 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335218 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05900eae-0749-428d-9c98-65b20ccaef25-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335227 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwwfk\" (UniqueName: \"kubernetes.io/projected/05900eae-0749-428d-9c98-65b20ccaef25-kube-api-access-xwwfk\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335237 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335245 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335253 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvppl\" (UniqueName: \"kubernetes.io/projected/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-kube-api-access-kvppl\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.335263 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-scripts\") on node \"crc\" DevicePath \"\"" 
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.338399 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.346005 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.351186 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data" (OuterVolumeSpecName: "config-data") pod "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" (UID: "ae93bf6b-cba0-4e57-8686-ab3d694cfc3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.366478 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c740c5-2fd7-47bf-b34a-e4df82a1c970-kube-api-access-tr7tg" (OuterVolumeSpecName: "kube-api-access-tr7tg") pod "22c740c5-2fd7-47bf-b34a-e4df82a1c970" (UID: "22c740c5-2fd7-47bf-b34a-e4df82a1c970"). InnerVolumeSpecName "kube-api-access-tr7tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369216 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369703 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="proxy-httpd" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369724 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="proxy-httpd" Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369742 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="cinder-scheduler" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369752 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="cinder-scheduler" Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369768 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-central-agent" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369775 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-central-agent" Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369789 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-notification-agent" Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369797 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-notification-agent" Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369809 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="sg-core"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369816 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="sg-core"
Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369830 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-log"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369837 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-log"
Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369852 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="init"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369859 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="init"
Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369874 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="dnsmasq-dns"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369884 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="dnsmasq-dns"
Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369896 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="probe"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369904 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="probe"
Oct 08 22:44:39 crc kubenswrapper[4834]: E1008 22:44:39.369923 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-httpd"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.369933 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-httpd"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370182 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-central-agent"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370200 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="ceilometer-notification-agent"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370219 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="probe"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370230 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="proxy-httpd"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370246 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="05900eae-0749-428d-9c98-65b20ccaef25" containerName="sg-core"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370269 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-log"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370280 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34caaaa-9ad3-42b8-8537-876601474580" containerName="glance-httpd"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370294 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" containerName="dnsmasq-dns"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.370307 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" containerName="cinder-scheduler"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.371476 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.377245 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.384317 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.387411 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.393181 4834 scope.go:117] "RemoveContainer" containerID="334dbc351804b123ccb2ad2c99ced8191afb6411583d0b48bc1eb6e511322b49"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.416769 4834 scope.go:117] "RemoveContainer" containerID="2ab134505a31e4a3aecfbd84147c1a11da7f89069d5db0e00164907b6204e8df"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.437009 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.437352 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr7tg\" (UniqueName: \"kubernetes.io/projected/22c740c5-2fd7-47bf-b34a-e4df82a1c970-kube-api-access-tr7tg\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.440250 4834 scope.go:117] "RemoveContainer" containerID="3d96780e3f5bf5401fcf705807bcc64c49484592f8ee3396cf64804cc4dd5a27"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.441810 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.451653 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-config" (OuterVolumeSpecName: "config") pod "22c740c5-2fd7-47bf-b34a-e4df82a1c970" (UID: "22c740c5-2fd7-47bf-b34a-e4df82a1c970"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.451970 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22c740c5-2fd7-47bf-b34a-e4df82a1c970" (UID: "22c740c5-2fd7-47bf-b34a-e4df82a1c970"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.452858 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22c740c5-2fd7-47bf-b34a-e4df82a1c970" (UID: "22c740c5-2fd7-47bf-b34a-e4df82a1c970"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.453311 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22c740c5-2fd7-47bf-b34a-e4df82a1c970" (UID: "22c740c5-2fd7-47bf-b34a-e4df82a1c970"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.458117 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-config-data" (OuterVolumeSpecName: "config-data") pod "05900eae-0749-428d-9c98-65b20ccaef25" (UID: "05900eae-0749-428d-9c98-65b20ccaef25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.458239 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22c740c5-2fd7-47bf-b34a-e4df82a1c970" (UID: "22c740c5-2fd7-47bf-b34a-e4df82a1c970"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.539848 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.539903 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-logs\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.539931 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.539947 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540012 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540041 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540125 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlz4\" (UniqueName: \"kubernetes.io/projected/37143980-a3f8-4398-a1d7-0f8189fb5366-kube-api-access-vqlz4\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540179 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540243 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540253 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540264 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540272 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540300 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-config\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540308 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22c740c5-2fd7-47bf-b34a-e4df82a1c970-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.540316 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05900eae-0749-428d-9c98-65b20ccaef25-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.568881 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34caaaa-9ad3-42b8-8537-876601474580" path="/var/lib/kubelet/pods/b34caaaa-9ad3-42b8-8537-876601474580/volumes"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.617043 4834 scope.go:117] "RemoveContainer" containerID="dc5e2e8417ace5c999280b76594b36eb8fbcb6af9691cc8cce6e64f0a2c22444"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.632411 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77dd5cf987-ns26j"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.640248 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77dd5cf987-ns26j"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.641535 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.641672 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlz4\" (UniqueName: \"kubernetes.io/projected/37143980-a3f8-4398-a1d7-0f8189fb5366-kube-api-access-vqlz4\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.641766 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.641893 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.641957 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-logs\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.642009 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.642054 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.642099 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.642597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-logs\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.642817 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.645123 4834 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.660925 4834 scope.go:117] "RemoveContainer" containerID="d083678f5ed26a71eacdcaadb0d7d43293b629fd6d360371306fb6b2c57fb7d9"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.665219 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-scripts\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.666113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.669642 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.673086 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.685458 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.686817 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-config-data\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.688957 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlz4\" (UniqueName: \"kubernetes.io/projected/37143980-a3f8-4398-a1d7-0f8189fb5366-kube-api-access-vqlz4\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.709751 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.717193 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.720457 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.753136 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.759877 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.762780 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.777581 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.779742 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.785239 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.785555 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.787124 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.803552 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.812893 4834 scope.go:117] "RemoveContainer" containerID="cdbfd918937b6685ea1d96b1cec28360fa1081ade005c89ae879d54bc06f6228"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.835597 4834 scope.go:117] "RemoveContainer" containerID="aa03ee65e2ccb19d2281ae90f69f5e354c069d95c67dae21cec48157be39120a"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.859045 4834 scope.go:117] "RemoveContainer" containerID="cc045e43247a37d4a44ed55adfaf9586f56217a8ef6fd2d786ddc2a902bb8299"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950262 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-scripts\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950302 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l82nh\" (UniqueName: \"kubernetes.io/projected/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-kube-api-access-l82nh\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950348 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-scripts\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950375 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950410 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-run-httpd\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950430 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-log-httpd\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950444 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950462 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950479 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950504 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-config-data\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950519 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950543 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.950562 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jphv\" (UniqueName: \"kubernetes.io/projected/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-kube-api-access-8jphv\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:39 crc kubenswrapper[4834]: I1008 22:44:39.994560 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.051869 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-config-data\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.051912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.051948 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.051979 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jphv\" (UniqueName: \"kubernetes.io/projected/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-kube-api-access-8jphv\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052066 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l82nh\" (UniqueName: \"kubernetes.io/projected/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-kube-api-access-l82nh\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052087 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-scripts\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052159 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-scripts\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052201 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052249 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-run-httpd\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052274 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-log-httpd\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052560 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052710 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-run-httpd\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052811 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052853 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.052887 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.053088 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-log-httpd\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.057376 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.057711 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.057960 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-config-data\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.058187 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.059873 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.061226 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.063409 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-scripts\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.063668 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-scripts\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.075441 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jphv\" (UniqueName: \"kubernetes.io/projected/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-kube-api-access-8jphv\") pod \"ceilometer-0\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.082202 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l82nh\" (UniqueName: \"kubernetes.io/projected/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-kube-api-access-l82nh\") pod \"cinder-scheduler-0\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.118924 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.127792 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.330186 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" event={"ID":"30f60664-0524-49bc-8e17-19305b2ae60a","Type":"ContainerStarted","Data":"ce5764d2183a5398012423761f4ca8e1f6b29ff026fd012d325298c4b1b7f24d"}
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.353807 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" podStartSLOduration=2.433534342 podStartE2EDuration="13.353789624s" podCreationTimestamp="2025-10-08 22:44:27 +0000 UTC" firstStartedPulling="2025-10-08 22:44:28.332687589 +0000 UTC m=+1276.155572325" lastFinishedPulling="2025-10-08 22:44:39.252942861 +0000 UTC m=+1287.075827607" observedRunningTime="2025-10-08 22:44:40.348235869 +0000 UTC m=+1288.171120615" watchObservedRunningTime="2025-10-08 22:44:40.353789624 +0000 UTC m=+1288.176674370"
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.618208 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.665225 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 08 22:44:40 crc kubenswrapper[4834]: W1008 22:44:40.672830 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ba8f9be_99cb_4173_aa3b_f8ba2aabb57c.slice/crio-55a61ae21161f808e2370ed0ef72b15c1c6f81d7ea2a20b590d081b02eb38a04 WatchSource:0}: Error finding container 55a61ae21161f808e2370ed0ef72b15c1c6f81d7ea2a20b590d081b02eb38a04: Status 404 returned error can't find the container with id 55a61ae21161f808e2370ed0ef72b15c1c6f81d7ea2a20b590d081b02eb38a04
Oct 08 22:44:40 crc kubenswrapper[4834]: I1008 22:44:40.675363 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack/cinder-scheduler-0"] Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.361858 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c","Type":"ContainerStarted","Data":"0caa48090b97f4cd0f143f8b3522146daa146a232e6629762e868998fa0cbab2"} Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.362321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c","Type":"ContainerStarted","Data":"55a61ae21161f808e2370ed0ef72b15c1c6f81d7ea2a20b590d081b02eb38a04"} Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.363562 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerStarted","Data":"9793cb4bcff61f1ab82fc59ac56cade2e841c411358a191267a3e73799412c12"} Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.374578 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37143980-a3f8-4398-a1d7-0f8189fb5366","Type":"ContainerStarted","Data":"97de4609bf212569f354a8db48765bc289647b7d03ecd1100224cb7a89ad47c3"} Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.374632 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37143980-a3f8-4398-a1d7-0f8189fb5366","Type":"ContainerStarted","Data":"6ef5a229e7d07f00f199279a38c613785b06ec9ce380dac470d56e2748fe1027"} Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.575925 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05900eae-0749-428d-9c98-65b20ccaef25" path="/var/lib/kubelet/pods/05900eae-0749-428d-9c98-65b20ccaef25/volumes" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.576982 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="22c740c5-2fd7-47bf-b34a-e4df82a1c970" path="/var/lib/kubelet/pods/22c740c5-2fd7-47bf-b34a-e4df82a1c970/volumes" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.577553 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae93bf6b-cba0-4e57-8686-ab3d694cfc3c" path="/var/lib/kubelet/pods/ae93bf6b-cba0-4e57-8686-ab3d694cfc3c/volumes" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.647714 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.648177 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.712547 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.745741 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 08 22:44:41 crc kubenswrapper[4834]: I1008 22:44:41.770176 4834 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod389aef32-a13f-4210-889b-5f2eb59fae52"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod389aef32-a13f-4210-889b-5f2eb59fae52] : Timed out while waiting for systemd to remove kubepods-besteffort-pod389aef32_a13f_4210_889b_5f2eb59fae52.slice" Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.385567 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerStarted","Data":"931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250"} Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.386237 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerStarted","Data":"99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756"} Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.388134 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37143980-a3f8-4398-a1d7-0f8189fb5366","Type":"ContainerStarted","Data":"c3f0ee497d77bf25a33bb1a3381c77e479b17a2eac69da1fa447bfb0e183a0e4"} Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.391896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c","Type":"ContainerStarted","Data":"16284da8c6f35291d6b17fb6dfd3cb470e7f9c4ec4f746f4bc0659d2fb85b5fb"} Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.391947 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.392109 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.412649 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.412630163 podStartE2EDuration="3.412630163s" podCreationTimestamp="2025-10-08 22:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:42.405971411 +0000 UTC m=+1290.228856157" watchObservedRunningTime="2025-10-08 22:44:42.412630163 +0000 UTC m=+1290.235514909" Oct 08 22:44:42 crc kubenswrapper[4834]: I1008 22:44:42.468120 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.468100289 podStartE2EDuration="3.468100289s" podCreationTimestamp="2025-10-08 22:44:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:44:42.445994622 +0000 UTC m=+1290.268879368" watchObservedRunningTime="2025-10-08 22:44:42.468100289 +0000 UTC m=+1290.290985035" Oct 08 22:44:43 crc kubenswrapper[4834]: I1008 22:44:43.413097 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerStarted","Data":"6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f"} Oct 08 22:44:43 crc kubenswrapper[4834]: I1008 22:44:43.935588 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 08 22:44:44 crc kubenswrapper[4834]: I1008 22:44:44.421423 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:44:44 crc kubenswrapper[4834]: I1008 22:44:44.421797 4834 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 08 22:44:44 crc kubenswrapper[4834]: I1008 22:44:44.627946 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 22:44:45 crc kubenswrapper[4834]: I1008 22:44:45.047472 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 08 22:44:45 crc kubenswrapper[4834]: I1008 22:44:45.119241 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 08 22:44:45 crc kubenswrapper[4834]: I1008 22:44:45.382788 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:45 crc kubenswrapper[4834]: I1008 22:44:45.432258 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerStarted","Data":"56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2"} Oct 08 22:44:45 crc kubenswrapper[4834]: I1008 22:44:45.432369 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:44:45 crc kubenswrapper[4834]: I1008 22:44:45.474617 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.652729421 podStartE2EDuration="6.474601784s" podCreationTimestamp="2025-10-08 22:44:39 +0000 UTC" firstStartedPulling="2025-10-08 22:44:40.669252725 +0000 UTC m=+1288.492137501" lastFinishedPulling="2025-10-08 22:44:44.491125118 +0000 UTC m=+1292.314009864" observedRunningTime="2025-10-08 22:44:45.472196976 +0000 UTC m=+1293.295081722" watchObservedRunningTime="2025-10-08 22:44:45.474601784 +0000 UTC m=+1293.297486530" Oct 08 22:44:46 crc kubenswrapper[4834]: I1008 22:44:46.440748 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-central-agent" containerID="cri-o://99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756" gracePeriod=30 Oct 08 22:44:46 crc kubenswrapper[4834]: I1008 22:44:46.441169 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="proxy-httpd" containerID="cri-o://56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2" gracePeriod=30 Oct 08 22:44:46 crc kubenswrapper[4834]: I1008 22:44:46.441227 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="sg-core" containerID="cri-o://6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f" gracePeriod=30 Oct 08 22:44:46 crc 
kubenswrapper[4834]: I1008 22:44:46.441263 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-notification-agent" containerID="cri-o://931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250" gracePeriod=30 Oct 08 22:44:47 crc kubenswrapper[4834]: I1008 22:44:47.450784 4834 generic.go:334] "Generic (PLEG): container finished" podID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerID="56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2" exitCode=0 Oct 08 22:44:47 crc kubenswrapper[4834]: I1008 22:44:47.452106 4834 generic.go:334] "Generic (PLEG): container finished" podID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerID="6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f" exitCode=2 Oct 08 22:44:47 crc kubenswrapper[4834]: I1008 22:44:47.452221 4834 generic.go:334] "Generic (PLEG): container finished" podID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerID="931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250" exitCode=0 Oct 08 22:44:47 crc kubenswrapper[4834]: I1008 22:44:47.450834 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerDied","Data":"56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2"} Oct 08 22:44:47 crc kubenswrapper[4834]: I1008 22:44:47.452366 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerDied","Data":"6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f"} Oct 08 22:44:47 crc kubenswrapper[4834]: I1008 22:44:47.452455 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerDied","Data":"931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250"} 
Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.246983 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.311908 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-sg-core-conf-yaml\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.311966 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-log-httpd\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.312015 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-config-data\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.312048 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-run-httpd\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.312083 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jphv\" (UniqueName: \"kubernetes.io/projected/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-kube-api-access-8jphv\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.312109 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-combined-ca-bundle\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.312448 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-scripts\") pod \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\" (UID: \"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24\") " Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.312946 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.313241 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.313765 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.313782 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.331959 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-kube-api-access-8jphv" (OuterVolumeSpecName: "kube-api-access-8jphv") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "kube-api-access-8jphv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.344112 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-scripts" (OuterVolumeSpecName: "scripts") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.349261 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.400899 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.415255 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.415281 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.415294 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jphv\" (UniqueName: \"kubernetes.io/projected/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-kube-api-access-8jphv\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.415304 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.419843 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-config-data" (OuterVolumeSpecName: "config-data") pod "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" (UID: "02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.462641 4834 generic.go:334] "Generic (PLEG): container finished" podID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerID="99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756" exitCode=0 Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.462681 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerDied","Data":"99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756"} Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.462706 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24","Type":"ContainerDied","Data":"9793cb4bcff61f1ab82fc59ac56cade2e841c411358a191267a3e73799412c12"} Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.462722 4834 scope.go:117] "RemoveContainer" containerID="56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.462841 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.494183 4834 scope.go:117] "RemoveContainer" containerID="6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.500301 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.509895 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.527763 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.528192 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="sg-core" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.528213 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="sg-core" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.528227 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-central-agent" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.528236 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-central-agent" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.528254 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-notification-agent" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.528262 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-notification-agent" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.528288 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="proxy-httpd" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.528295 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="proxy-httpd" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.533608 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="sg-core" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.533662 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-notification-agent" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.533688 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="ceilometer-central-agent" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.533703 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" containerName="proxy-httpd" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.534171 4834 scope.go:117] "RemoveContainer" containerID="931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.537056 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.537763 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.539590 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.540025 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.552168 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.564160 4834 scope.go:117] "RemoveContainer" containerID="99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.595693 4834 scope.go:117] "RemoveContainer" containerID="56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.596192 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2\": container with ID starting with 56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2 not found: ID does not exist" containerID="56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.596235 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2"} err="failed to get container status \"56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2\": rpc error: code = NotFound desc = could not find 
container \"56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2\": container with ID starting with 56f03632132db10588ea1cb614885813bce3179e661c7637420c61a4026b9be2 not found: ID does not exist" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.596264 4834 scope.go:117] "RemoveContainer" containerID="6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.596756 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f\": container with ID starting with 6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f not found: ID does not exist" containerID="6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.596830 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f"} err="failed to get container status \"6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f\": rpc error: code = NotFound desc = could not find container \"6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f\": container with ID starting with 6591b26188149fd43dbc65b994d2c7c3004d2b3e847b382500bfb0489b59067f not found: ID does not exist" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.597015 4834 scope.go:117] "RemoveContainer" containerID="931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.597895 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250\": container with ID starting with 931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250 not found: ID does 
not exist" containerID="931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.597950 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250"} err="failed to get container status \"931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250\": rpc error: code = NotFound desc = could not find container \"931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250\": container with ID starting with 931e2e15ae08cd606c4b1909d9af73b31b736b5967a9661b5a33ce70824b6250 not found: ID does not exist" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.597977 4834 scope.go:117] "RemoveContainer" containerID="99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756" Oct 08 22:44:48 crc kubenswrapper[4834]: E1008 22:44:48.598603 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756\": container with ID starting with 99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756 not found: ID does not exist" containerID="99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.598648 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756"} err="failed to get container status \"99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756\": rpc error: code = NotFound desc = could not find container \"99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756\": container with ID starting with 99555f22588f3e5a3703bce154b58f987fbb7fb7ada408b3f54fa2d3cebf3756 not found: ID does not exist" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.639573 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.639654 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-log-httpd\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.639674 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.639716 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-config-data\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.639990 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf8ln\" (UniqueName: \"kubernetes.io/projected/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-kube-api-access-jf8ln\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.640056 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-scripts\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.640315 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-run-httpd\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.742346 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-run-httpd\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.742473 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.742514 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-log-httpd\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.742531 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc 
kubenswrapper[4834]: I1008 22:44:48.742567 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-config-data\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.742616 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf8ln\" (UniqueName: \"kubernetes.io/projected/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-kube-api-access-jf8ln\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.742643 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-scripts\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.743137 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-run-httpd\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.744105 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-log-httpd\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.747402 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-scripts\") pod \"ceilometer-0\" (UID: 
\"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.747438 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-config-data\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.747443 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.747514 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.762973 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf8ln\" (UniqueName: \"kubernetes.io/projected/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-kube-api-access-jf8ln\") pod \"ceilometer-0\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " pod="openstack/ceilometer-0" Oct 08 22:44:48 crc kubenswrapper[4834]: I1008 22:44:48.852540 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.272287 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.325458 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.481304 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerStarted","Data":"f03f0a8b271b11d7602dc5877697707503250ee4996a082666f8abc62b9527c6"} Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.567881 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24" path="/var/lib/kubelet/pods/02e8ef7c-c1f2-43b5-8a82-9d2bb30d3a24/volumes" Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.963003 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.996086 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:49 crc kubenswrapper[4834]: I1008 22:44:49.996134 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:50 crc kubenswrapper[4834]: I1008 22:44:50.029157 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:50 crc kubenswrapper[4834]: I1008 22:44:50.049484 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:50 crc kubenswrapper[4834]: I1008 22:44:50.338430 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 08 
22:44:50 crc kubenswrapper[4834]: I1008 22:44:50.495742 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerStarted","Data":"f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c"} Oct 08 22:44:50 crc kubenswrapper[4834]: I1008 22:44:50.495953 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:50 crc kubenswrapper[4834]: I1008 22:44:50.495986 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:51 crc kubenswrapper[4834]: I1008 22:44:51.508794 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerStarted","Data":"80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86"} Oct 08 22:44:51 crc kubenswrapper[4834]: I1008 22:44:51.585592 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:44:51 crc kubenswrapper[4834]: I1008 22:44:51.669819 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5846bc496d-9g2mt"] Oct 08 22:44:51 crc kubenswrapper[4834]: I1008 22:44:51.670326 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5846bc496d-9g2mt" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-api" containerID="cri-o://d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06" gracePeriod=30 Oct 08 22:44:51 crc kubenswrapper[4834]: I1008 22:44:51.670774 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5846bc496d-9g2mt" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-httpd" containerID="cri-o://a20ef257a47565feda7278ac014084ef785869b3b364fc58d6f71aabd1ca4509" gracePeriod=30 
Oct 08 22:44:52 crc kubenswrapper[4834]: I1008 22:44:52.470180 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:52 crc kubenswrapper[4834]: I1008 22:44:52.476661 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 08 22:44:52 crc kubenswrapper[4834]: I1008 22:44:52.554678 4834 generic.go:334] "Generic (PLEG): container finished" podID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerID="a20ef257a47565feda7278ac014084ef785869b3b364fc58d6f71aabd1ca4509" exitCode=0 Oct 08 22:44:52 crc kubenswrapper[4834]: I1008 22:44:52.556193 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5846bc496d-9g2mt" event={"ID":"ad08d0fa-74e3-4211-a991-3e12be132fca","Type":"ContainerDied","Data":"a20ef257a47565feda7278ac014084ef785869b3b364fc58d6f71aabd1ca4509"} Oct 08 22:44:53 crc kubenswrapper[4834]: I1008 22:44:53.573096 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerStarted","Data":"f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502"} Oct 08 22:44:55 crc kubenswrapper[4834]: E1008 22:44:55.509708 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad08d0fa_74e3_4211_a991_3e12be132fca.slice/crio-d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad08d0fa_74e3_4211_a991_3e12be132fca.slice/crio-conmon-d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.595035 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerStarted","Data":"99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185"} Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.595181 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-central-agent" containerID="cri-o://f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c" gracePeriod=30 Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.595214 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.595222 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="proxy-httpd" containerID="cri-o://99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185" gracePeriod=30 Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.595253 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-notification-agent" containerID="cri-o://80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86" gracePeriod=30 Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.595253 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="sg-core" containerID="cri-o://f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502" gracePeriod=30 Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.603853 4834 generic.go:334] "Generic (PLEG): container finished" podID="ad08d0fa-74e3-4211-a991-3e12be132fca" 
containerID="d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06" exitCode=0 Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.603914 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5846bc496d-9g2mt" event={"ID":"ad08d0fa-74e3-4211-a991-3e12be132fca","Type":"ContainerDied","Data":"d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06"} Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.626505 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.116442937 podStartE2EDuration="7.626485429s" podCreationTimestamp="2025-10-08 22:44:48 +0000 UTC" firstStartedPulling="2025-10-08 22:44:49.338479087 +0000 UTC m=+1297.161363833" lastFinishedPulling="2025-10-08 22:44:54.848521579 +0000 UTC m=+1302.671406325" observedRunningTime="2025-10-08 22:44:55.61991628 +0000 UTC m=+1303.442801026" watchObservedRunningTime="2025-10-08 22:44:55.626485429 +0000 UTC m=+1303.449370175" Oct 08 22:44:55 crc kubenswrapper[4834]: I1008 22:44:55.965839 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.115561 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-ovndb-tls-certs\") pod \"ad08d0fa-74e3-4211-a991-3e12be132fca\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.115604 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-config\") pod \"ad08d0fa-74e3-4211-a991-3e12be132fca\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.115635 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-httpd-config\") pod \"ad08d0fa-74e3-4211-a991-3e12be132fca\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.115876 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf9jz\" (UniqueName: \"kubernetes.io/projected/ad08d0fa-74e3-4211-a991-3e12be132fca-kube-api-access-xf9jz\") pod \"ad08d0fa-74e3-4211-a991-3e12be132fca\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.115904 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-combined-ca-bundle\") pod \"ad08d0fa-74e3-4211-a991-3e12be132fca\" (UID: \"ad08d0fa-74e3-4211-a991-3e12be132fca\") " Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.124537 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ad08d0fa-74e3-4211-a991-3e12be132fca-kube-api-access-xf9jz" (OuterVolumeSpecName: "kube-api-access-xf9jz") pod "ad08d0fa-74e3-4211-a991-3e12be132fca" (UID: "ad08d0fa-74e3-4211-a991-3e12be132fca"). InnerVolumeSpecName "kube-api-access-xf9jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.129401 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ad08d0fa-74e3-4211-a991-3e12be132fca" (UID: "ad08d0fa-74e3-4211-a991-3e12be132fca"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.180802 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-config" (OuterVolumeSpecName: "config") pod "ad08d0fa-74e3-4211-a991-3e12be132fca" (UID: "ad08d0fa-74e3-4211-a991-3e12be132fca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.181960 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad08d0fa-74e3-4211-a991-3e12be132fca" (UID: "ad08d0fa-74e3-4211-a991-3e12be132fca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.214107 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ad08d0fa-74e3-4211-a991-3e12be132fca" (UID: "ad08d0fa-74e3-4211-a991-3e12be132fca"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.219962 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.219994 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf9jz\" (UniqueName: \"kubernetes.io/projected/ad08d0fa-74e3-4211-a991-3e12be132fca-kube-api-access-xf9jz\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.220005 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.220015 4834 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.220023 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad08d0fa-74e3-4211-a991-3e12be132fca-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.616337 4834 generic.go:334] "Generic (PLEG): container finished" podID="30f60664-0524-49bc-8e17-19305b2ae60a" containerID="ce5764d2183a5398012423761f4ca8e1f6b29ff026fd012d325298c4b1b7f24d" exitCode=0 Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.616444 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" event={"ID":"30f60664-0524-49bc-8e17-19305b2ae60a","Type":"ContainerDied","Data":"ce5764d2183a5398012423761f4ca8e1f6b29ff026fd012d325298c4b1b7f24d"} Oct 08 22:44:56 crc kubenswrapper[4834]: 
I1008 22:44:56.625859 4834 generic.go:334] "Generic (PLEG): container finished" podID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerID="99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185" exitCode=0 Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.625912 4834 generic.go:334] "Generic (PLEG): container finished" podID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerID="f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502" exitCode=2 Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.625936 4834 generic.go:334] "Generic (PLEG): container finished" podID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerID="80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86" exitCode=0 Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.626019 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerDied","Data":"99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185"} Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.626067 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerDied","Data":"f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502"} Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.626093 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerDied","Data":"80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86"} Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.630305 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5846bc496d-9g2mt" event={"ID":"ad08d0fa-74e3-4211-a991-3e12be132fca","Type":"ContainerDied","Data":"7f7fcb3fed5113e971d22256630f2029c631e9bfd72d57cac053b2e6f4968522"} Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.630354 4834 
scope.go:117] "RemoveContainer" containerID="a20ef257a47565feda7278ac014084ef785869b3b364fc58d6f71aabd1ca4509" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.630401 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5846bc496d-9g2mt" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.663497 4834 scope.go:117] "RemoveContainer" containerID="d83b40373e83582320b645236aea12d90979891ced3452dc0393cc534b68cf06" Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.683602 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5846bc496d-9g2mt"] Oct 08 22:44:56 crc kubenswrapper[4834]: I1008 22:44:56.691689 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5846bc496d-9g2mt"] Oct 08 22:44:57 crc kubenswrapper[4834]: I1008 22:44:57.569955 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" path="/var/lib/kubelet/pods/ad08d0fa-74e3-4211-a991-3e12be132fca/volumes" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.016389 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.159366 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7d5c\" (UniqueName: \"kubernetes.io/projected/30f60664-0524-49bc-8e17-19305b2ae60a-kube-api-access-s7d5c\") pod \"30f60664-0524-49bc-8e17-19305b2ae60a\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.159457 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-scripts\") pod \"30f60664-0524-49bc-8e17-19305b2ae60a\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.159527 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-config-data\") pod \"30f60664-0524-49bc-8e17-19305b2ae60a\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.159570 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-combined-ca-bundle\") pod \"30f60664-0524-49bc-8e17-19305b2ae60a\" (UID: \"30f60664-0524-49bc-8e17-19305b2ae60a\") " Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.165904 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-scripts" (OuterVolumeSpecName: "scripts") pod "30f60664-0524-49bc-8e17-19305b2ae60a" (UID: "30f60664-0524-49bc-8e17-19305b2ae60a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.186339 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f60664-0524-49bc-8e17-19305b2ae60a-kube-api-access-s7d5c" (OuterVolumeSpecName: "kube-api-access-s7d5c") pod "30f60664-0524-49bc-8e17-19305b2ae60a" (UID: "30f60664-0524-49bc-8e17-19305b2ae60a"). InnerVolumeSpecName "kube-api-access-s7d5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.206422 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30f60664-0524-49bc-8e17-19305b2ae60a" (UID: "30f60664-0524-49bc-8e17-19305b2ae60a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.213949 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-config-data" (OuterVolumeSpecName: "config-data") pod "30f60664-0524-49bc-8e17-19305b2ae60a" (UID: "30f60664-0524-49bc-8e17-19305b2ae60a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.261877 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7d5c\" (UniqueName: \"kubernetes.io/projected/30f60664-0524-49bc-8e17-19305b2ae60a-kube-api-access-s7d5c\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.262331 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.262346 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.262360 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30f60664-0524-49bc-8e17-19305b2ae60a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.660574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" event={"ID":"30f60664-0524-49bc-8e17-19305b2ae60a","Type":"ContainerDied","Data":"af0f3b9a7d893187989ef544705b5f3a340f4f680bd904de95429d1b99a8b8f7"} Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.660624 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af0f3b9a7d893187989ef544705b5f3a340f4f680bd904de95429d1b99a8b8f7" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.660694 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zp2gc" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780107 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:44:58 crc kubenswrapper[4834]: E1008 22:44:58.780636 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30f60664-0524-49bc-8e17-19305b2ae60a" containerName="nova-cell0-conductor-db-sync" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780656 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f60664-0524-49bc-8e17-19305b2ae60a" containerName="nova-cell0-conductor-db-sync" Oct 08 22:44:58 crc kubenswrapper[4834]: E1008 22:44:58.780679 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-httpd" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780687 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-httpd" Oct 08 22:44:58 crc kubenswrapper[4834]: E1008 22:44:58.780716 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-api" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780727 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-api" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780935 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-httpd" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780965 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f60664-0524-49bc-8e17-19305b2ae60a" containerName="nova-cell0-conductor-db-sync" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.780981 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad08d0fa-74e3-4211-a991-3e12be132fca" containerName="neutron-api" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.781673 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.783618 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mrcjf" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.788978 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.798110 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.815709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.816128 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.816239 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrjtr\" (UniqueName: \"kubernetes.io/projected/81641859-a43e-4d35-bc09-f541277c77da-kube-api-access-qrjtr\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: 
I1008 22:44:58.917805 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrjtr\" (UniqueName: \"kubernetes.io/projected/81641859-a43e-4d35-bc09-f541277c77da-kube-api-access-qrjtr\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.917922 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.918014 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.923647 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.928011 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:58 crc kubenswrapper[4834]: I1008 22:44:58.937552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrjtr\" 
(UniqueName: \"kubernetes.io/projected/81641859-a43e-4d35-bc09-f541277c77da-kube-api-access-qrjtr\") pod \"nova-cell0-conductor-0\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:59 crc kubenswrapper[4834]: I1008 22:44:59.104397 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 22:44:59 crc kubenswrapper[4834]: I1008 22:44:59.605731 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:44:59 crc kubenswrapper[4834]: W1008 22:44:59.606532 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81641859_a43e_4d35_bc09_f541277c77da.slice/crio-09fa05b1cd6cbb741f7bb1e530e34dc61992d68a3a175d0ea1ffa982f2c8cd8f WatchSource:0}: Error finding container 09fa05b1cd6cbb741f7bb1e530e34dc61992d68a3a175d0ea1ffa982f2c8cd8f: Status 404 returned error can't find the container with id 09fa05b1cd6cbb741f7bb1e530e34dc61992d68a3a175d0ea1ffa982f2c8cd8f Oct 08 22:44:59 crc kubenswrapper[4834]: I1008 22:44:59.672903 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81641859-a43e-4d35-bc09-f541277c77da","Type":"ContainerStarted","Data":"09fa05b1cd6cbb741f7bb1e530e34dc61992d68a3a175d0ea1ffa982f2c8cd8f"} Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.154988 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds"] Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.157495 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.160681 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.160998 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.168393 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds"] Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.274881 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3073f21-b4aa-4113-964e-d4e157b9d53e-config-volume\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.274924 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3073f21-b4aa-4113-964e-d4e157b9d53e-secret-volume\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.274976 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqjt5\" (UniqueName: \"kubernetes.io/projected/b3073f21-b4aa-4113-964e-d4e157b9d53e-kube-api-access-vqjt5\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.316922 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376358 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-sg-core-conf-yaml\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376558 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-config-data\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376616 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-scripts\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376663 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-combined-ca-bundle\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376690 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-run-httpd\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 
22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376726 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-log-httpd\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.376754 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf8ln\" (UniqueName: \"kubernetes.io/projected/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-kube-api-access-jf8ln\") pod \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\" (UID: \"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698\") " Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.377082 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3073f21-b4aa-4113-964e-d4e157b9d53e-config-volume\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.377129 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3073f21-b4aa-4113-964e-d4e157b9d53e-secret-volume\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.377228 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqjt5\" (UniqueName: \"kubernetes.io/projected/b3073f21-b4aa-4113-964e-d4e157b9d53e-kube-api-access-vqjt5\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc 
kubenswrapper[4834]: I1008 22:45:00.378653 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.379270 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.380651 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3073f21-b4aa-4113-964e-d4e157b9d53e-config-volume\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.384871 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3073f21-b4aa-4113-964e-d4e157b9d53e-secret-volume\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.386524 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-kube-api-access-jf8ln" (OuterVolumeSpecName: "kube-api-access-jf8ln") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: 
"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "kube-api-access-jf8ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.397584 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-scripts" (OuterVolumeSpecName: "scripts") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.404530 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.407433 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqjt5\" (UniqueName: \"kubernetes.io/projected/b3073f21-b4aa-4113-964e-d4e157b9d53e-kube-api-access-vqjt5\") pod \"collect-profiles-29332725-9mxds\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.463385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.478497 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.478529 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.478538 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.478550 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.478559 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.478567 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf8ln\" (UniqueName: \"kubernetes.io/projected/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-kube-api-access-jf8ln\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.491232 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.504927 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-config-data" (OuterVolumeSpecName: "config-data") pod "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" (UID: "4497d8c7-cdc4-4ef1-91a3-cf68e2d67698"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.580856 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.687028 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81641859-a43e-4d35-bc09-f541277c77da","Type":"ContainerStarted","Data":"8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4"} Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.687227 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.690697 4834 generic.go:334] "Generic (PLEG): container finished" podID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerID="f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c" exitCode=0 Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.690734 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerDied","Data":"f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c"} Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.690767 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.690784 4834 scope.go:117] "RemoveContainer" containerID="99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.690770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4497d8c7-cdc4-4ef1-91a3-cf68e2d67698","Type":"ContainerDied","Data":"f03f0a8b271b11d7602dc5877697707503250ee4996a082666f8abc62b9527c6"} Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.713412 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.7133915760000002 podStartE2EDuration="2.713391576s" podCreationTimestamp="2025-10-08 22:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:00.706557031 +0000 UTC m=+1308.529441787" watchObservedRunningTime="2025-10-08 22:45:00.713391576 +0000 UTC m=+1308.536276322" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.728481 4834 scope.go:117] "RemoveContainer" containerID="f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.755258 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.777374 4834 scope.go:117] "RemoveContainer" containerID="80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.780010 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789246 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.789670 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="proxy-httpd" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789687 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="proxy-httpd" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.789702 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-central-agent" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789707 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-central-agent" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.789735 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-notification-agent" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789741 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-notification-agent" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.789748 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="sg-core" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789755 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="sg-core" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789911 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="proxy-httpd" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789922 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="sg-core" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789933 4834 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-central-agent" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.789946 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" containerName="ceilometer-notification-agent" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.791674 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.794043 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.794604 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.797889 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.798637 4834 scope.go:117] "RemoveContainer" containerID="f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.824697 4834 scope.go:117] "RemoveContainer" containerID="99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.825030 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185\": container with ID starting with 99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185 not found: ID does not exist" containerID="99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.825149 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185"} err="failed to get container status \"99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185\": rpc error: code = NotFound desc = could not find container \"99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185\": container with ID starting with 99a6a028b6653860013904289e64a4d8db5130e7bc1f7992864fadcfc6758185 not found: ID does not exist" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.825274 4834 scope.go:117] "RemoveContainer" containerID="f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.825832 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502\": container with ID starting with f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502 not found: ID does not exist" containerID="f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.825873 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502"} err="failed to get container status \"f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502\": rpc error: code = NotFound desc = could not find container \"f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502\": container with ID starting with f4102c9f99bc2fa332a1a1cff364d42993c84bee9f6e24b91564ede153782502 not found: ID does not exist" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.825901 4834 scope.go:117] "RemoveContainer" containerID="80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.826236 4834 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86\": container with ID starting with 80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86 not found: ID does not exist" containerID="80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.826293 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86"} err="failed to get container status \"80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86\": rpc error: code = NotFound desc = could not find container \"80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86\": container with ID starting with 80e76239bb5848d2f375e4ec3bec614e2bdcc3ee7b932e23f2580a0403806d86 not found: ID does not exist" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.826330 4834 scope.go:117] "RemoveContainer" containerID="f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c" Oct 08 22:45:00 crc kubenswrapper[4834]: E1008 22:45:00.826629 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c\": container with ID starting with f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c not found: ID does not exist" containerID="f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.826723 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c"} err="failed to get container status \"f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c\": rpc error: code = NotFound desc = could not find container 
\"f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c\": container with ID starting with f9a2a2f2c950b59d85798c0dd9fe2be40d1e579fe9b730530184499b1d06e20c not found: ID does not exist" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887005 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-run-httpd\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887102 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-scripts\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887162 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887218 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsmk\" (UniqueName: \"kubernetes.io/projected/c61bdc81-d45d-44dd-a286-10811d22efa2-kube-api-access-4bsmk\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887270 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-config-data\") pod \"ceilometer-0\" (UID: 
\"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887341 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.887393 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-log-httpd\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.965976 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds"] Oct 08 22:45:00 crc kubenswrapper[4834]: W1008 22:45:00.970431 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3073f21_b4aa_4113_964e_d4e157b9d53e.slice/crio-4e5c6e114bda83e644c7070cf44767f5671c23c9d2dbbc9464d0907a5d5081c7 WatchSource:0}: Error finding container 4e5c6e114bda83e644c7070cf44767f5671c23c9d2dbbc9464d0907a5d5081c7: Status 404 returned error can't find the container with id 4e5c6e114bda83e644c7070cf44767f5671c23c9d2dbbc9464d0907a5d5081c7 Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.988905 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-log-httpd\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989034 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-run-httpd\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989075 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-scripts\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989108 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989173 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsmk\" (UniqueName: \"kubernetes.io/projected/c61bdc81-d45d-44dd-a286-10811d22efa2-kube-api-access-4bsmk\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989200 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-config-data\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989416 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-log-httpd\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.989588 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-run-httpd\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.994810 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-scripts\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.996786 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:00 crc kubenswrapper[4834]: I1008 22:45:00.997181 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-config-data\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.001134 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.004255 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsmk\" (UniqueName: \"kubernetes.io/projected/c61bdc81-d45d-44dd-a286-10811d22efa2-kube-api-access-4bsmk\") pod \"ceilometer-0\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " pod="openstack/ceilometer-0" Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.118104 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.574386 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4497d8c7-cdc4-4ef1-91a3-cf68e2d67698" path="/var/lib/kubelet/pods/4497d8c7-cdc4-4ef1-91a3-cf68e2d67698/volumes" Oct 08 22:45:01 crc kubenswrapper[4834]: W1008 22:45:01.588247 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc61bdc81_d45d_44dd_a286_10811d22efa2.slice/crio-cddbc6d8532e3193d1e703a2192ae63d80da3ac7610233514ba01dcc5811e2ec WatchSource:0}: Error finding container cddbc6d8532e3193d1e703a2192ae63d80da3ac7610233514ba01dcc5811e2ec: Status 404 returned error can't find the container with id cddbc6d8532e3193d1e703a2192ae63d80da3ac7610233514ba01dcc5811e2ec Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.604904 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.703448 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerStarted","Data":"cddbc6d8532e3193d1e703a2192ae63d80da3ac7610233514ba01dcc5811e2ec"} Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 
22:45:01.705729 4834 generic.go:334] "Generic (PLEG): container finished" podID="b3073f21-b4aa-4113-964e-d4e157b9d53e" containerID="045bfd6e47c7cb04456f3f30e28a9fb501037bf64345af689d7236e746207bdb" exitCode=0 Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.706081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" event={"ID":"b3073f21-b4aa-4113-964e-d4e157b9d53e","Type":"ContainerDied","Data":"045bfd6e47c7cb04456f3f30e28a9fb501037bf64345af689d7236e746207bdb"} Oct 08 22:45:01 crc kubenswrapper[4834]: I1008 22:45:01.706106 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" event={"ID":"b3073f21-b4aa-4113-964e-d4e157b9d53e","Type":"ContainerStarted","Data":"4e5c6e114bda83e644c7070cf44767f5671c23c9d2dbbc9464d0907a5d5081c7"} Oct 08 22:45:02 crc kubenswrapper[4834]: I1008 22:45:02.715863 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerStarted","Data":"1bc1d76011d94b5ead51f990a8ced182f57e49d4ceefb6a78b1334ed5940b7a9"} Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.061543 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.139242 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3073f21-b4aa-4113-964e-d4e157b9d53e-secret-volume\") pod \"b3073f21-b4aa-4113-964e-d4e157b9d53e\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.139274 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqjt5\" (UniqueName: \"kubernetes.io/projected/b3073f21-b4aa-4113-964e-d4e157b9d53e-kube-api-access-vqjt5\") pod \"b3073f21-b4aa-4113-964e-d4e157b9d53e\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.139336 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3073f21-b4aa-4113-964e-d4e157b9d53e-config-volume\") pod \"b3073f21-b4aa-4113-964e-d4e157b9d53e\" (UID: \"b3073f21-b4aa-4113-964e-d4e157b9d53e\") " Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.140492 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3073f21-b4aa-4113-964e-d4e157b9d53e-config-volume" (OuterVolumeSpecName: "config-volume") pod "b3073f21-b4aa-4113-964e-d4e157b9d53e" (UID: "b3073f21-b4aa-4113-964e-d4e157b9d53e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.143890 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3073f21-b4aa-4113-964e-d4e157b9d53e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b3073f21-b4aa-4113-964e-d4e157b9d53e" (UID: "b3073f21-b4aa-4113-964e-d4e157b9d53e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.148485 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3073f21-b4aa-4113-964e-d4e157b9d53e-kube-api-access-vqjt5" (OuterVolumeSpecName: "kube-api-access-vqjt5") pod "b3073f21-b4aa-4113-964e-d4e157b9d53e" (UID: "b3073f21-b4aa-4113-964e-d4e157b9d53e"). InnerVolumeSpecName "kube-api-access-vqjt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.241840 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3073f21-b4aa-4113-964e-d4e157b9d53e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.242009 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqjt5\" (UniqueName: \"kubernetes.io/projected/b3073f21-b4aa-4113-964e-d4e157b9d53e-kube-api-access-vqjt5\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.242066 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3073f21-b4aa-4113-964e-d4e157b9d53e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.736412 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" event={"ID":"b3073f21-b4aa-4113-964e-d4e157b9d53e","Type":"ContainerDied","Data":"4e5c6e114bda83e644c7070cf44767f5671c23c9d2dbbc9464d0907a5d5081c7"} Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.736892 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e5c6e114bda83e644c7070cf44767f5671c23c9d2dbbc9464d0907a5d5081c7" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.736439 4834 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds" Oct 08 22:45:03 crc kubenswrapper[4834]: I1008 22:45:03.739818 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerStarted","Data":"90dbfe82b44f65cc0540ee995c758612740536708b5b33999e93e7decd7f1461"} Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.136050 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.592645 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rx46c"] Oct 08 22:45:04 crc kubenswrapper[4834]: E1008 22:45:04.595721 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3073f21-b4aa-4113-964e-d4e157b9d53e" containerName="collect-profiles" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.595829 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3073f21-b4aa-4113-964e-d4e157b9d53e" containerName="collect-profiles" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.596242 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3073f21-b4aa-4113-964e-d4e157b9d53e" containerName="collect-profiles" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.597221 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.601039 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rx46c"] Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.605841 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.606129 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.699588 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-config-data\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.699641 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-scripts\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.699676 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.699720 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8j44\" (UniqueName: 
\"kubernetes.io/projected/c708dd7c-f12f-49bf-a622-74b33227c62f-kube-api-access-q8j44\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.754686 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerStarted","Data":"d9b4ddc679f848c2a87d22c9b36af537436a290dddee3124cc669e12dc609eaa"} Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.803182 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-config-data\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.803292 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-scripts\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.803371 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.803505 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8j44\" (UniqueName: \"kubernetes.io/projected/c708dd7c-f12f-49bf-a622-74b33227c62f-kube-api-access-q8j44\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: 
\"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.831768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-scripts\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.832883 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.833984 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-config-data\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.876745 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.876840 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8j44\" (UniqueName: \"kubernetes.io/projected/c708dd7c-f12f-49bf-a622-74b33227c62f-kube-api-access-q8j44\") pod \"nova-cell0-cell-mapping-rx46c\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.886868 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.902425 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.920462 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.934644 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.935966 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.938761 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:04 crc kubenswrapper[4834]: I1008 22:45:04.940501 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:04.999220 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007148 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-config-data\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007285 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007335 4834 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-config-data\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007438 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007520 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14fc6262-b8ce-4cd4-927f-b78590123bd9-logs\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007605 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnvhz\" (UniqueName: \"kubernetes.io/projected/cb961381-3f4f-4192-b7de-098fd403ff53-kube-api-access-bnvhz\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.007643 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5xw\" (UniqueName: \"kubernetes.io/projected/14fc6262-b8ce-4cd4-927f-b78590123bd9-kube-api-access-zk5xw\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.110728 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.111086 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14fc6262-b8ce-4cd4-927f-b78590123bd9-logs\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.111133 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnvhz\" (UniqueName: \"kubernetes.io/projected/cb961381-3f4f-4192-b7de-098fd403ff53-kube-api-access-bnvhz\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.111171 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5xw\" (UniqueName: \"kubernetes.io/projected/14fc6262-b8ce-4cd4-927f-b78590123bd9-kube-api-access-zk5xw\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.111226 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-config-data\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.111279 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 
crc kubenswrapper[4834]: I1008 22:45:05.111300 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-config-data\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.128456 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-config-data\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.128520 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.128597 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14fc6262-b8ce-4cd4-927f-b78590123bd9-logs\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.128949 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.131197 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-config-data\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.149143 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.151595 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.162434 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.164371 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.204860 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.207707 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5xw\" (UniqueName: \"kubernetes.io/projected/14fc6262-b8ce-4cd4-927f-b78590123bd9-kube-api-access-zk5xw\") pod \"nova-api-0\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.210663 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.212276 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.212325 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.212416 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4r5\" (UniqueName: \"kubernetes.io/projected/b7fe0249-e25b-4912-b569-d8b16d8da682-kube-api-access-dq4r5\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.214284 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnvhz\" (UniqueName: \"kubernetes.io/projected/cb961381-3f4f-4192-b7de-098fd403ff53-kube-api-access-bnvhz\") pod \"nova-scheduler-0\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.224166 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.224749 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 
22:45:05.305926 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313769 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313813 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313830 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313877 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldns\" (UniqueName: \"kubernetes.io/projected/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-kube-api-access-6ldns\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313916 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4r5\" (UniqueName: \"kubernetes.io/projected/b7fe0249-e25b-4912-b569-d8b16d8da682-kube-api-access-dq4r5\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313950 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-logs\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.313973 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-config-data\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.315956 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64d8d96789-wxgvg"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.317332 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.330416 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.337758 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.348594 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64d8d96789-wxgvg"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.356327 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4r5\" (UniqueName: \"kubernetes.io/projected/b7fe0249-e25b-4912-b569-d8b16d8da682-kube-api-access-dq4r5\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415458 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-svc\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415513 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldns\" (UniqueName: 
\"kubernetes.io/projected/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-kube-api-access-6ldns\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415558 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmbkl\" (UniqueName: \"kubernetes.io/projected/7cbed812-c266-46ad-9b64-ccb83e6efb76-kube-api-access-vmbkl\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415602 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-swift-storage-0\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415620 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-config\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415657 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-logs\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415677 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-config-data\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415709 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-nb\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415773 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.415801 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-sb\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.416891 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-logs\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.420442 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.421992 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-config-data\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.433238 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.440128 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldns\" (UniqueName: \"kubernetes.io/projected/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-kube-api-access-6ldns\") pod \"nova-metadata-0\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.518090 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-sb\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.518168 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-svc\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.518209 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmbkl\" (UniqueName: \"kubernetes.io/projected/7cbed812-c266-46ad-9b64-ccb83e6efb76-kube-api-access-vmbkl\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.518241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-swift-storage-0\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.518262 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-config\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.518307 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-nb\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.519252 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-nb\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.519733 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-sb\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.520788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-svc\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.521935 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-swift-storage-0\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.523074 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-config\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.539112 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmbkl\" (UniqueName: \"kubernetes.io/projected/7cbed812-c266-46ad-9b64-ccb83e6efb76-kube-api-access-vmbkl\") pod \"dnsmasq-dns-64d8d96789-wxgvg\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") " pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.642076 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.667408 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.676700 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.692276 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.718483 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rx46c"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.782228 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14fc6262-b8ce-4cd4-927f-b78590123bd9","Type":"ContainerStarted","Data":"339318a0149ac3b3ffa96febb8fd15ab911f0aa751949fa844dca444e58d7780"} Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.795170 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerStarted","Data":"cfc0275a1c70a5dc176a8c69b72aa2ee4c3c1bead2405ddbd71e516b638ac75d"} Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.848262 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl62z"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.856694 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.861487 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.866822 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.870428 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl62z"] Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.926328 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-config-data\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.926405 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.926425 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-scripts\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:05 crc kubenswrapper[4834]: I1008 22:45:05.926760 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9ctnd\" (UniqueName: \"kubernetes.io/projected/4d347a09-693b-4c37-8a0c-5143e16fd9f8-kube-api-access-9ctnd\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.020038 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.031894 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ctnd\" (UniqueName: \"kubernetes.io/projected/4d347a09-693b-4c37-8a0c-5143e16fd9f8-kube-api-access-9ctnd\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.032004 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-config-data\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.032080 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.032109 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-scripts\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " 
pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.037554 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-scripts\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.042334 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.050614 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-config-data\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.053602 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ctnd\" (UniqueName: \"kubernetes.io/projected/4d347a09-693b-4c37-8a0c-5143e16fd9f8-kube-api-access-9ctnd\") pod \"nova-cell1-conductor-db-sync-dl62z\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.193897 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.323158 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.342215 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64d8d96789-wxgvg"] Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.807396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb961381-3f4f-4192-b7de-098fd403ff53","Type":"ContainerStarted","Data":"b87d21bc1c047a999322c0ee37c0f655b0ebef7def1e032bfd5e41d0da032f20"} Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.809685 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rx46c" event={"ID":"c708dd7c-f12f-49bf-a622-74b33227c62f","Type":"ContainerStarted","Data":"88b0573350e6ee588acc06ceb59b0e3a08b66ae8628f405a308c2807ec31d4eb"} Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.810648 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:45:06 crc kubenswrapper[4834]: I1008 22:45:06.841942 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.175330503 podStartE2EDuration="6.84192023s" podCreationTimestamp="2025-10-08 22:45:00 +0000 UTC" firstStartedPulling="2025-10-08 22:45:01.591775632 +0000 UTC m=+1309.414660378" lastFinishedPulling="2025-10-08 22:45:05.258365359 +0000 UTC m=+1313.081250105" observedRunningTime="2025-10-08 22:45:06.839904751 +0000 UTC m=+1314.662789497" watchObservedRunningTime="2025-10-08 22:45:06.84192023 +0000 UTC m=+1314.664804986" Oct 08 22:45:06 crc kubenswrapper[4834]: W1008 22:45:06.979912 4834 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d068c06_21ea_48dc_a816_8c6c7e21d3d6.slice/crio-b50c46e19709799f9cb6e1a3d79d9f8b25d7e1d9da93f393e4220d0becd2b860 WatchSource:0}: Error finding container b50c46e19709799f9cb6e1a3d79d9f8b25d7e1d9da93f393e4220d0becd2b860: Status 404 returned error can't find the container with id b50c46e19709799f9cb6e1a3d79d9f8b25d7e1d9da93f393e4220d0becd2b860 Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.005825 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:07 crc kubenswrapper[4834]: W1008 22:45:07.428019 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d347a09_693b_4c37_8a0c_5143e16fd9f8.slice/crio-134df7f310933a970a75b8e3ceea67dece21330f673b989e4fcc8f32811a0fd7 WatchSource:0}: Error finding container 134df7f310933a970a75b8e3ceea67dece21330f673b989e4fcc8f32811a0fd7: Status 404 returned error can't find the container with id 134df7f310933a970a75b8e3ceea67dece21330f673b989e4fcc8f32811a0fd7 Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.428707 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl62z"] Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.825395 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl62z" event={"ID":"4d347a09-693b-4c37-8a0c-5143e16fd9f8","Type":"ContainerStarted","Data":"2374fa95bd5da7dd8560e90d3040135b0604e7518dabea482807d6d2761dc576"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.825453 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl62z" event={"ID":"4d347a09-693b-4c37-8a0c-5143e16fd9f8","Type":"ContainerStarted","Data":"134df7f310933a970a75b8e3ceea67dece21330f673b989e4fcc8f32811a0fd7"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.828561 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d068c06-21ea-48dc-a816-8c6c7e21d3d6","Type":"ContainerStarted","Data":"b50c46e19709799f9cb6e1a3d79d9f8b25d7e1d9da93f393e4220d0becd2b860"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.829884 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7fe0249-e25b-4912-b569-d8b16d8da682","Type":"ContainerStarted","Data":"b5b3bc723da654909f550545019cb5a28d1a085d15135bed45d0fad921824391"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.834940 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rx46c" event={"ID":"c708dd7c-f12f-49bf-a622-74b33227c62f","Type":"ContainerStarted","Data":"056ba376def004f5c203b607fe214be5d15dfb07d6766619f892b183abcc6853"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.841062 4834 generic.go:334] "Generic (PLEG): container finished" podID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerID="67cba6b2e6fb5760abc4a1fed36847d10ef3d9623587e8d7c3c492e7f0b8bcbb" exitCode=0 Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.842317 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" event={"ID":"7cbed812-c266-46ad-9b64-ccb83e6efb76","Type":"ContainerDied","Data":"67cba6b2e6fb5760abc4a1fed36847d10ef3d9623587e8d7c3c492e7f0b8bcbb"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.842346 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" event={"ID":"7cbed812-c266-46ad-9b64-ccb83e6efb76","Type":"ContainerStarted","Data":"388385b0825a99d1b0ca0e34f609dadc874f098b3adf29cc150b6478018abd70"} Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.857071 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dl62z" podStartSLOduration=2.857051803 podStartE2EDuration="2.857051803s" 
podCreationTimestamp="2025-10-08 22:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:07.840340568 +0000 UTC m=+1315.663225304" watchObservedRunningTime="2025-10-08 22:45:07.857051803 +0000 UTC m=+1315.679936549" Oct 08 22:45:07 crc kubenswrapper[4834]: I1008 22:45:07.867069 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rx46c" podStartSLOduration=3.867050216 podStartE2EDuration="3.867050216s" podCreationTimestamp="2025-10-08 22:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:07.865759884 +0000 UTC m=+1315.688644630" watchObservedRunningTime="2025-10-08 22:45:07.867050216 +0000 UTC m=+1315.689934962" Oct 08 22:45:08 crc kubenswrapper[4834]: I1008 22:45:08.665689 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:08 crc kubenswrapper[4834]: I1008 22:45:08.686520 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:08 crc kubenswrapper[4834]: I1008 22:45:08.856661 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" event={"ID":"7cbed812-c266-46ad-9b64-ccb83e6efb76","Type":"ContainerStarted","Data":"17651b84fca04f45f755cc82539e7ab66b0bf27627a435857700c2da43a23544"} Oct 08 22:45:09 crc kubenswrapper[4834]: I1008 22:45:09.865954 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:09 crc kubenswrapper[4834]: I1008 22:45:09.891673 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" podStartSLOduration=4.891655085 podStartE2EDuration="4.891655085s" podCreationTimestamp="2025-10-08 22:45:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:09.884782148 +0000 UTC m=+1317.707666894" watchObservedRunningTime="2025-10-08 22:45:09.891655085 +0000 UTC m=+1317.714539831" Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.898736 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7fe0249-e25b-4912-b569-d8b16d8da682","Type":"ContainerStarted","Data":"50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064"} Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.898836 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b7fe0249-e25b-4912-b569-d8b16d8da682" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064" gracePeriod=30 Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.901216 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14fc6262-b8ce-4cd4-927f-b78590123bd9","Type":"ContainerStarted","Data":"45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7"} Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.901254 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14fc6262-b8ce-4cd4-927f-b78590123bd9","Type":"ContainerStarted","Data":"33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7"} Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.905507 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d068c06-21ea-48dc-a816-8c6c7e21d3d6","Type":"ContainerStarted","Data":"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda"} Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.905536 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"9d068c06-21ea-48dc-a816-8c6c7e21d3d6","Type":"ContainerStarted","Data":"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd"} Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.905634 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-log" containerID="cri-o://3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd" gracePeriod=30 Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.905692 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-metadata" containerID="cri-o://2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda" gracePeriod=30 Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.908222 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb961381-3f4f-4192-b7de-098fd403ff53","Type":"ContainerStarted","Data":"7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935"} Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.930881 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.056327257 podStartE2EDuration="7.930861964s" podCreationTimestamp="2025-10-08 22:45:05 +0000 UTC" firstStartedPulling="2025-10-08 22:45:07.020802339 +0000 UTC m=+1314.843687085" lastFinishedPulling="2025-10-08 22:45:11.895337016 +0000 UTC m=+1319.718221792" observedRunningTime="2025-10-08 22:45:12.920296678 +0000 UTC m=+1320.743181424" watchObservedRunningTime="2025-10-08 22:45:12.930861964 +0000 UTC m=+1320.753746720" Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.941903 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.01565328 
podStartE2EDuration="7.941882781s" podCreationTimestamp="2025-10-08 22:45:05 +0000 UTC" firstStartedPulling="2025-10-08 22:45:06.991514599 +0000 UTC m=+1314.814399385" lastFinishedPulling="2025-10-08 22:45:11.91774413 +0000 UTC m=+1319.740628886" observedRunningTime="2025-10-08 22:45:12.936937271 +0000 UTC m=+1320.759822027" watchObservedRunningTime="2025-10-08 22:45:12.941882781 +0000 UTC m=+1320.764767527" Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.963663 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.129936466 podStartE2EDuration="8.963642999s" podCreationTimestamp="2025-10-08 22:45:04 +0000 UTC" firstStartedPulling="2025-10-08 22:45:06.063829667 +0000 UTC m=+1313.886714413" lastFinishedPulling="2025-10-08 22:45:11.89753618 +0000 UTC m=+1319.720420946" observedRunningTime="2025-10-08 22:45:12.956281521 +0000 UTC m=+1320.779166277" watchObservedRunningTime="2025-10-08 22:45:12.963642999 +0000 UTC m=+1320.786527745" Oct 08 22:45:12 crc kubenswrapper[4834]: I1008 22:45:12.976584 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.823875652 podStartE2EDuration="8.976567152s" podCreationTimestamp="2025-10-08 22:45:04 +0000 UTC" firstStartedPulling="2025-10-08 22:45:05.767677814 +0000 UTC m=+1313.590562550" lastFinishedPulling="2025-10-08 22:45:11.920369294 +0000 UTC m=+1319.743254050" observedRunningTime="2025-10-08 22:45:12.973095879 +0000 UTC m=+1320.795980625" watchObservedRunningTime="2025-10-08 22:45:12.976567152 +0000 UTC m=+1320.799451898" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.539347 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.598581 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-combined-ca-bundle\") pod \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.598725 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ldns\" (UniqueName: \"kubernetes.io/projected/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-kube-api-access-6ldns\") pod \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.598804 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-config-data\") pod \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.598840 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-logs\") pod \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\" (UID: \"9d068c06-21ea-48dc-a816-8c6c7e21d3d6\") " Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.599438 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-logs" (OuterVolumeSpecName: "logs") pod "9d068c06-21ea-48dc-a816-8c6c7e21d3d6" (UID: "9d068c06-21ea-48dc-a816-8c6c7e21d3d6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.607163 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-kube-api-access-6ldns" (OuterVolumeSpecName: "kube-api-access-6ldns") pod "9d068c06-21ea-48dc-a816-8c6c7e21d3d6" (UID: "9d068c06-21ea-48dc-a816-8c6c7e21d3d6"). InnerVolumeSpecName "kube-api-access-6ldns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.638680 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d068c06-21ea-48dc-a816-8c6c7e21d3d6" (UID: "9d068c06-21ea-48dc-a816-8c6c7e21d3d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.639031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-config-data" (OuterVolumeSpecName: "config-data") pod "9d068c06-21ea-48dc-a816-8c6c7e21d3d6" (UID: "9d068c06-21ea-48dc-a816-8c6c7e21d3d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.701475 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ldns\" (UniqueName: \"kubernetes.io/projected/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-kube-api-access-6ldns\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.701507 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.701519 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.701528 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d068c06-21ea-48dc-a816-8c6c7e21d3d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.931502 4834 generic.go:334] "Generic (PLEG): container finished" podID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerID="2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda" exitCode=0 Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.931551 4834 generic.go:334] "Generic (PLEG): container finished" podID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerID="3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd" exitCode=143 Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.932709 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.942337 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d068c06-21ea-48dc-a816-8c6c7e21d3d6","Type":"ContainerDied","Data":"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda"} Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.942426 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d068c06-21ea-48dc-a816-8c6c7e21d3d6","Type":"ContainerDied","Data":"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd"} Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.942479 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d068c06-21ea-48dc-a816-8c6c7e21d3d6","Type":"ContainerDied","Data":"b50c46e19709799f9cb6e1a3d79d9f8b25d7e1d9da93f393e4220d0becd2b860"} Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.942516 4834 scope.go:117] "RemoveContainer" containerID="2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda" Oct 08 22:45:13 crc kubenswrapper[4834]: I1008 22:45:13.994005 4834 scope.go:117] "RemoveContainer" containerID="3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.002843 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.016849 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.025704 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:14 crc kubenswrapper[4834]: E1008 22:45:14.026384 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-metadata" Oct 08 22:45:14 crc 
kubenswrapper[4834]: I1008 22:45:14.026404 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-metadata" Oct 08 22:45:14 crc kubenswrapper[4834]: E1008 22:45:14.026415 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-log" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.026486 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-log" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.026907 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-log" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.026938 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" containerName="nova-metadata-metadata" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.028423 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.055895 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.067724 4834 scope.go:117] "RemoveContainer" containerID="2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.068275 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 22:45:14 crc kubenswrapper[4834]: E1008 22:45:14.068465 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda\": container with ID starting with 2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda not found: ID does not exist" containerID="2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.068497 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda"} err="failed to get container status \"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda\": rpc error: code = NotFound desc = could not find container \"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda\": container with ID starting with 2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda not found: ID does not exist" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.068522 4834 scope.go:117] "RemoveContainer" containerID="3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.068642 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:45:14 crc 
kubenswrapper[4834]: E1008 22:45:14.069302 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd\": container with ID starting with 3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd not found: ID does not exist" containerID="3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.069335 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd"} err="failed to get container status \"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd\": rpc error: code = NotFound desc = could not find container \"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd\": container with ID starting with 3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd not found: ID does not exist" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.069358 4834 scope.go:117] "RemoveContainer" containerID="2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.069658 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda"} err="failed to get container status \"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda\": rpc error: code = NotFound desc = could not find container \"2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda\": container with ID starting with 2ddaca7e008361fdeecc98165f4d53aacc15278c32d157437d8ad281e3cbabda not found: ID does not exist" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.069677 4834 scope.go:117] "RemoveContainer" containerID="3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd" Oct 08 
22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.069877 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd"} err="failed to get container status \"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd\": rpc error: code = NotFound desc = could not find container \"3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd\": container with ID starting with 3af96b17de1f14ccd1a438efe99a12a661d3577fe8273d804cfd199c7077d6bd not found: ID does not exist" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.110307 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mhp\" (UniqueName: \"kubernetes.io/projected/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-kube-api-access-x5mhp\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.110429 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-logs\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.110491 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-config-data\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.110539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.110564 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.212511 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mhp\" (UniqueName: \"kubernetes.io/projected/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-kube-api-access-x5mhp\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.212645 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-logs\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.212704 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-config-data\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.212735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " 
pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.212752 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.213771 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-logs\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.219919 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.220568 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-config-data\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.236696 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.244729 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mhp\" (UniqueName: 
\"kubernetes.io/projected/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-kube-api-access-x5mhp\") pod \"nova-metadata-0\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.386569 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.850909 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.943299 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21a7b6e7-8a8f-402c-b20e-8cd7cc637867","Type":"ContainerStarted","Data":"7db068d40268b98dc315ac926531f83ed6df1096c70cbf20d8c29377a4a8737b"} Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.947928 4834 generic.go:334] "Generic (PLEG): container finished" podID="c708dd7c-f12f-49bf-a622-74b33227c62f" containerID="056ba376def004f5c203b607fe214be5d15dfb07d6766619f892b183abcc6853" exitCode=0 Oct 08 22:45:14 crc kubenswrapper[4834]: I1008 22:45:14.947956 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rx46c" event={"ID":"c708dd7c-f12f-49bf-a622-74b33227c62f","Type":"ContainerDied","Data":"056ba376def004f5c203b607fe214be5d15dfb07d6766619f892b183abcc6853"} Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.307182 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.307592 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.420549 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.420618 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.455079 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.568895 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d068c06-21ea-48dc-a816-8c6c7e21d3d6" path="/var/lib/kubelet/pods/9d068c06-21ea-48dc-a816-8c6c7e21d3d6/volumes" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.642942 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.681358 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.787905 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59b9656b65-lm9hg"] Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.790300 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" containerName="dnsmasq-dns" containerID="cri-o://3599314830d6c060085be33d2da4be7769553f44c3fa06c55dee8c8207a178bd" gracePeriod=10 Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.966985 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21a7b6e7-8a8f-402c-b20e-8cd7cc637867","Type":"ContainerStarted","Data":"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d"} Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.967029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21a7b6e7-8a8f-402c-b20e-8cd7cc637867","Type":"ContainerStarted","Data":"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a"} Oct 08 22:45:15 crc 
kubenswrapper[4834]: I1008 22:45:15.968797 4834 generic.go:334] "Generic (PLEG): container finished" podID="b67f945d-dab3-4ece-9627-d5891da263ae" containerID="3599314830d6c060085be33d2da4be7769553f44c3fa06c55dee8c8207a178bd" exitCode=0 Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.968977 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" event={"ID":"b67f945d-dab3-4ece-9627-d5891da263ae","Type":"ContainerDied","Data":"3599314830d6c060085be33d2da4be7769553f44c3fa06c55dee8c8207a178bd"} Oct 08 22:45:15 crc kubenswrapper[4834]: I1008 22:45:15.986649 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9866328549999999 podStartE2EDuration="1.986632855s" podCreationTimestamp="2025-10-08 22:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:15.986581934 +0000 UTC m=+1323.809466680" watchObservedRunningTime="2025-10-08 22:45:15.986632855 +0000 UTC m=+1323.809517591" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.015616 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.305554 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.391311 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.391516 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.400676 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.473287 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-combined-ca-bundle\") pod \"c708dd7c-f12f-49bf-a622-74b33227c62f\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474005 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-config-data\") pod \"c708dd7c-f12f-49bf-a622-74b33227c62f\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474034 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-sb\") pod \"b67f945d-dab3-4ece-9627-d5891da263ae\" (UID: 
\"b67f945d-dab3-4ece-9627-d5891da263ae\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474066 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-svc\") pod \"b67f945d-dab3-4ece-9627-d5891da263ae\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474083 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtgdx\" (UniqueName: \"kubernetes.io/projected/b67f945d-dab3-4ece-9627-d5891da263ae-kube-api-access-gtgdx\") pod \"b67f945d-dab3-4ece-9627-d5891da263ae\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474130 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-scripts\") pod \"c708dd7c-f12f-49bf-a622-74b33227c62f\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-nb\") pod \"b67f945d-dab3-4ece-9627-d5891da263ae\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474293 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-config\") pod \"b67f945d-dab3-4ece-9627-d5891da263ae\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474318 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8j44\" (UniqueName: 
\"kubernetes.io/projected/c708dd7c-f12f-49bf-a622-74b33227c62f-kube-api-access-q8j44\") pod \"c708dd7c-f12f-49bf-a622-74b33227c62f\" (UID: \"c708dd7c-f12f-49bf-a622-74b33227c62f\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.474341 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-swift-storage-0\") pod \"b67f945d-dab3-4ece-9627-d5891da263ae\" (UID: \"b67f945d-dab3-4ece-9627-d5891da263ae\") " Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.479663 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67f945d-dab3-4ece-9627-d5891da263ae-kube-api-access-gtgdx" (OuterVolumeSpecName: "kube-api-access-gtgdx") pod "b67f945d-dab3-4ece-9627-d5891da263ae" (UID: "b67f945d-dab3-4ece-9627-d5891da263ae"). InnerVolumeSpecName "kube-api-access-gtgdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.479914 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-scripts" (OuterVolumeSpecName: "scripts") pod "c708dd7c-f12f-49bf-a622-74b33227c62f" (UID: "c708dd7c-f12f-49bf-a622-74b33227c62f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.483015 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c708dd7c-f12f-49bf-a622-74b33227c62f-kube-api-access-q8j44" (OuterVolumeSpecName: "kube-api-access-q8j44") pod "c708dd7c-f12f-49bf-a622-74b33227c62f" (UID: "c708dd7c-f12f-49bf-a622-74b33227c62f"). InnerVolumeSpecName "kube-api-access-q8j44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.502543 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c708dd7c-f12f-49bf-a622-74b33227c62f" (UID: "c708dd7c-f12f-49bf-a622-74b33227c62f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.505023 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-config-data" (OuterVolumeSpecName: "config-data") pod "c708dd7c-f12f-49bf-a622-74b33227c62f" (UID: "c708dd7c-f12f-49bf-a622-74b33227c62f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.522542 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b67f945d-dab3-4ece-9627-d5891da263ae" (UID: "b67f945d-dab3-4ece-9627-d5891da263ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.523743 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b67f945d-dab3-4ece-9627-d5891da263ae" (UID: "b67f945d-dab3-4ece-9627-d5891da263ae"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.532689 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b67f945d-dab3-4ece-9627-d5891da263ae" (UID: "b67f945d-dab3-4ece-9627-d5891da263ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.536784 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-config" (OuterVolumeSpecName: "config") pod "b67f945d-dab3-4ece-9627-d5891da263ae" (UID: "b67f945d-dab3-4ece-9627-d5891da263ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.538770 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b67f945d-dab3-4ece-9627-d5891da263ae" (UID: "b67f945d-dab3-4ece-9627-d5891da263ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.575920 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.575954 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8j44\" (UniqueName: \"kubernetes.io/projected/c708dd7c-f12f-49bf-a622-74b33227c62f-kube-api-access-q8j44\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.575967 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.575976 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.575986 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.575995 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.576004 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.576013 4834 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtgdx\" (UniqueName: \"kubernetes.io/projected/b67f945d-dab3-4ece-9627-d5891da263ae-kube-api-access-gtgdx\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.576021 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c708dd7c-f12f-49bf-a622-74b33227c62f-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.576028 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b67f945d-dab3-4ece-9627-d5891da263ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.981069 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rx46c" event={"ID":"c708dd7c-f12f-49bf-a622-74b33227c62f","Type":"ContainerDied","Data":"88b0573350e6ee588acc06ceb59b0e3a08b66ae8628f405a308c2807ec31d4eb"} Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.981367 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b0573350e6ee588acc06ceb59b0e3a08b66ae8628f405a308c2807ec31d4eb" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.981168 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rx46c" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.983509 4834 generic.go:334] "Generic (PLEG): container finished" podID="4d347a09-693b-4c37-8a0c-5143e16fd9f8" containerID="2374fa95bd5da7dd8560e90d3040135b0604e7518dabea482807d6d2761dc576" exitCode=0 Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.983579 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl62z" event={"ID":"4d347a09-693b-4c37-8a0c-5143e16fd9f8","Type":"ContainerDied","Data":"2374fa95bd5da7dd8560e90d3040135b0604e7518dabea482807d6d2761dc576"} Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.986285 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" event={"ID":"b67f945d-dab3-4ece-9627-d5891da263ae","Type":"ContainerDied","Data":"4bf18755698eadac6a0b6a8c4d8cf1320b962288741f68254b85a65d7b63f371"} Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.986364 4834 scope.go:117] "RemoveContainer" containerID="3599314830d6c060085be33d2da4be7769553f44c3fa06c55dee8c8207a178bd" Oct 08 22:45:16 crc kubenswrapper[4834]: I1008 22:45:16.986540 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b9656b65-lm9hg" Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.027588 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.027665 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.028574 4834 scope.go:117] "RemoveContainer" containerID="10dec061dba544e1b7211a31d01da7089008ca2b300d4f403582aec78220e5f6" Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.086890 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59b9656b65-lm9hg"] Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.116096 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59b9656b65-lm9hg"] Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.191070 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.211428 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.211637 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-log" containerID="cri-o://33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7" gracePeriod=30 Oct 08 22:45:17 crc 
kubenswrapper[4834]: I1008 22:45:17.211943 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-api" containerID="cri-o://45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7" gracePeriod=30 Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.251528 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:17 crc kubenswrapper[4834]: I1008 22:45:17.570790 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" path="/var/lib/kubelet/pods/b67f945d-dab3-4ece-9627-d5891da263ae/volumes" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.022251 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14fc6262-b8ce-4cd4-927f-b78590123bd9","Type":"ContainerDied","Data":"33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7"} Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.022306 4834 generic.go:334] "Generic (PLEG): container finished" podID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerID="33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7" exitCode=143 Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.023114 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-log" containerID="cri-o://481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a" gracePeriod=30 Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.023340 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-metadata" containerID="cri-o://f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d" gracePeriod=30 Oct 08 22:45:18 
crc kubenswrapper[4834]: I1008 22:45:18.023688 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cb961381-3f4f-4192-b7de-098fd403ff53" containerName="nova-scheduler-scheduler" containerID="cri-o://7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935" gracePeriod=30 Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.425078 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.523874 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-scripts\") pod \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.523956 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ctnd\" (UniqueName: \"kubernetes.io/projected/4d347a09-693b-4c37-8a0c-5143e16fd9f8-kube-api-access-9ctnd\") pod \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.524031 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-config-data\") pod \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.524130 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-combined-ca-bundle\") pod \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\" (UID: \"4d347a09-693b-4c37-8a0c-5143e16fd9f8\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 
22:45:18.529322 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d347a09-693b-4c37-8a0c-5143e16fd9f8-kube-api-access-9ctnd" (OuterVolumeSpecName: "kube-api-access-9ctnd") pod "4d347a09-693b-4c37-8a0c-5143e16fd9f8" (UID: "4d347a09-693b-4c37-8a0c-5143e16fd9f8"). InnerVolumeSpecName "kube-api-access-9ctnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.529414 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-scripts" (OuterVolumeSpecName: "scripts") pod "4d347a09-693b-4c37-8a0c-5143e16fd9f8" (UID: "4d347a09-693b-4c37-8a0c-5143e16fd9f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.550248 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-config-data" (OuterVolumeSpecName: "config-data") pod "4d347a09-693b-4c37-8a0c-5143e16fd9f8" (UID: "4d347a09-693b-4c37-8a0c-5143e16fd9f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.556743 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d347a09-693b-4c37-8a0c-5143e16fd9f8" (UID: "4d347a09-693b-4c37-8a0c-5143e16fd9f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.587293 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.626507 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.626574 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ctnd\" (UniqueName: \"kubernetes.io/projected/4d347a09-693b-4c37-8a0c-5143e16fd9f8-kube-api-access-9ctnd\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.626595 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.626612 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d347a09-693b-4c37-8a0c-5143e16fd9f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.727959 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-nova-metadata-tls-certs\") pod \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.728097 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mhp\" (UniqueName: \"kubernetes.io/projected/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-kube-api-access-x5mhp\") pod \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.728202 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-logs\") pod \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.728228 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-config-data\") pod \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.728359 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-combined-ca-bundle\") pod \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\" (UID: \"21a7b6e7-8a8f-402c-b20e-8cd7cc637867\") " Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.728801 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-logs" (OuterVolumeSpecName: "logs") pod "21a7b6e7-8a8f-402c-b20e-8cd7cc637867" (UID: "21a7b6e7-8a8f-402c-b20e-8cd7cc637867"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.733520 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-kube-api-access-x5mhp" (OuterVolumeSpecName: "kube-api-access-x5mhp") pod "21a7b6e7-8a8f-402c-b20e-8cd7cc637867" (UID: "21a7b6e7-8a8f-402c-b20e-8cd7cc637867"). InnerVolumeSpecName "kube-api-access-x5mhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.775207 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "21a7b6e7-8a8f-402c-b20e-8cd7cc637867" (UID: "21a7b6e7-8a8f-402c-b20e-8cd7cc637867"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.777120 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21a7b6e7-8a8f-402c-b20e-8cd7cc637867" (UID: "21a7b6e7-8a8f-402c-b20e-8cd7cc637867"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.779997 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-config-data" (OuterVolumeSpecName: "config-data") pod "21a7b6e7-8a8f-402c-b20e-8cd7cc637867" (UID: "21a7b6e7-8a8f-402c-b20e-8cd7cc637867"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.830831 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.830876 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.830890 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.830905 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:18 crc kubenswrapper[4834]: I1008 22:45:18.830917 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mhp\" (UniqueName: \"kubernetes.io/projected/21a7b6e7-8a8f-402c-b20e-8cd7cc637867-kube-api-access-x5mhp\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.041910 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dl62z" event={"ID":"4d347a09-693b-4c37-8a0c-5143e16fd9f8","Type":"ContainerDied","Data":"134df7f310933a970a75b8e3ceea67dece21330f673b989e4fcc8f32811a0fd7"} Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.042430 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134df7f310933a970a75b8e3ceea67dece21330f673b989e4fcc8f32811a0fd7" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.042697 
4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dl62z" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045445 4834 generic.go:334] "Generic (PLEG): container finished" podID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerID="f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d" exitCode=0 Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045499 4834 generic.go:334] "Generic (PLEG): container finished" podID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerID="481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a" exitCode=143 Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045523 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21a7b6e7-8a8f-402c-b20e-8cd7cc637867","Type":"ContainerDied","Data":"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d"} Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045772 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045827 4834 scope.go:117] "RemoveContainer" containerID="f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045811 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21a7b6e7-8a8f-402c-b20e-8cd7cc637867","Type":"ContainerDied","Data":"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a"} Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.045970 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21a7b6e7-8a8f-402c-b20e-8cd7cc637867","Type":"ContainerDied","Data":"7db068d40268b98dc315ac926531f83ed6df1096c70cbf20d8c29377a4a8737b"} Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.101391 4834 scope.go:117] "RemoveContainer" containerID="481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108090 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.108580 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-metadata" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108598 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-metadata" Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.108619 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-log" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108627 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-log" Oct 08 22:45:19 crc 
kubenswrapper[4834]: E1008 22:45:19.108649 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" containerName="dnsmasq-dns" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108657 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" containerName="dnsmasq-dns" Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.108680 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" containerName="init" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108688 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" containerName="init" Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.108703 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c708dd7c-f12f-49bf-a622-74b33227c62f" containerName="nova-manage" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108710 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c708dd7c-f12f-49bf-a622-74b33227c62f" containerName="nova-manage" Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.108730 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d347a09-693b-4c37-8a0c-5143e16fd9f8" containerName="nova-cell1-conductor-db-sync" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108738 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d347a09-693b-4c37-8a0c-5143e16fd9f8" containerName="nova-cell1-conductor-db-sync" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108943 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-log" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108967 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67f945d-dab3-4ece-9627-d5891da263ae" containerName="dnsmasq-dns" Oct 08 22:45:19 crc 
kubenswrapper[4834]: I1008 22:45:19.108984 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c708dd7c-f12f-49bf-a622-74b33227c62f" containerName="nova-manage" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.108995 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d347a09-693b-4c37-8a0c-5143e16fd9f8" containerName="nova-cell1-conductor-db-sync" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.109006 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" containerName="nova-metadata-metadata" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.109726 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.117700 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.154602 4834 scope.go:117] "RemoveContainer" containerID="f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.157752 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.157977 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d\": container with ID starting with f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d not found: ID does not exist" containerID="f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.160720 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d"} err="failed 
to get container status \"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d\": rpc error: code = NotFound desc = could not find container \"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d\": container with ID starting with f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d not found: ID does not exist" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.160748 4834 scope.go:117] "RemoveContainer" containerID="481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a" Oct 08 22:45:19 crc kubenswrapper[4834]: E1008 22:45:19.161343 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a\": container with ID starting with 481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a not found: ID does not exist" containerID="481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.161384 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a"} err="failed to get container status \"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a\": rpc error: code = NotFound desc = could not find container \"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a\": container with ID starting with 481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a not found: ID does not exist" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.161519 4834 scope.go:117] "RemoveContainer" containerID="f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.163500 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d"} 
err="failed to get container status \"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d\": rpc error: code = NotFound desc = could not find container \"f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d\": container with ID starting with f4b9e199a9e53019c332a87c82c4193040cfe41fe09845df3dbcc259f856dc7d not found: ID does not exist" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.163538 4834 scope.go:117] "RemoveContainer" containerID="481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.164193 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a"} err="failed to get container status \"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a\": rpc error: code = NotFound desc = could not find container \"481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a\": container with ID starting with 481783702f6321724811c0dbd1abb0268092065f216c1366caa938cb50a9121a not found: ID does not exist" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.172699 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.181524 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.199331 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.201040 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.203608 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.203915 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.210706 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.254900 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.254976 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlfjg\" (UniqueName: \"kubernetes.io/projected/92213f20-28bf-4fe1-b547-6867677b0049-kube-api-access-dlfjg\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.255017 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.356946 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.357399 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e088e5ae-16c2-4a56-a140-3159b429ad55-logs\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.357627 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlfjg\" (UniqueName: \"kubernetes.io/projected/92213f20-28bf-4fe1-b547-6867677b0049-kube-api-access-dlfjg\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.357801 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.358589 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.358735 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vnps\" (UniqueName: 
\"kubernetes.io/projected/e088e5ae-16c2-4a56-a140-3159b429ad55-kube-api-access-8vnps\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.358931 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-config-data\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.359084 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.363132 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.365997 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.380003 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlfjg\" (UniqueName: \"kubernetes.io/projected/92213f20-28bf-4fe1-b547-6867677b0049-kube-api-access-dlfjg\") pod \"nova-cell1-conductor-0\" (UID: 
\"92213f20-28bf-4fe1-b547-6867677b0049\") " pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.449884 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.460322 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.460584 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vnps\" (UniqueName: \"kubernetes.io/projected/e088e5ae-16c2-4a56-a140-3159b429ad55-kube-api-access-8vnps\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.460757 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-config-data\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.460881 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.460958 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e088e5ae-16c2-4a56-a140-3159b429ad55-logs\") pod \"nova-metadata-0\" 
(UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.461788 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e088e5ae-16c2-4a56-a140-3159b429ad55-logs\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.465609 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.465835 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.466135 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-config-data\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.493042 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vnps\" (UniqueName: \"kubernetes.io/projected/e088e5ae-16c2-4a56-a140-3159b429ad55-kube-api-access-8vnps\") pod \"nova-metadata-0\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.529868 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.573511 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21a7b6e7-8a8f-402c-b20e-8cd7cc637867" path="/var/lib/kubelet/pods/21a7b6e7-8a8f-402c-b20e-8cd7cc637867/volumes" Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.921473 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:45:19 crc kubenswrapper[4834]: W1008 22:45:19.928287 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92213f20_28bf_4fe1_b547_6867677b0049.slice/crio-4831f1b98054aff999105ef4a4c7451e953cf981dee2c6797895c5d9555bc81d WatchSource:0}: Error finding container 4831f1b98054aff999105ef4a4c7451e953cf981dee2c6797895c5d9555bc81d: Status 404 returned error can't find the container with id 4831f1b98054aff999105ef4a4c7451e953cf981dee2c6797895c5d9555bc81d Oct 08 22:45:19 crc kubenswrapper[4834]: I1008 22:45:19.993225 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.041873 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.057929 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"92213f20-28bf-4fe1-b547-6867677b0049","Type":"ContainerStarted","Data":"4831f1b98054aff999105ef4a4c7451e953cf981dee2c6797895c5d9555bc81d"} Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.061408 4834 generic.go:334] "Generic (PLEG): container finished" podID="cb961381-3f4f-4192-b7de-098fd403ff53" containerID="7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935" exitCode=0 Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.061481 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.061528 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb961381-3f4f-4192-b7de-098fd403ff53","Type":"ContainerDied","Data":"7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935"} Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.061584 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb961381-3f4f-4192-b7de-098fd403ff53","Type":"ContainerDied","Data":"b87d21bc1c047a999322c0ee37c0f655b0ebef7def1e032bfd5e41d0da032f20"} Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.061605 4834 scope.go:117] "RemoveContainer" containerID="7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.063212 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e088e5ae-16c2-4a56-a140-3159b429ad55","Type":"ContainerStarted","Data":"f026cd3653f38d7102fdd0f68278ebaf57e68c65a9a77ee53f1ec61c4455fd55"} Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.073135 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-combined-ca-bundle\") pod \"cb961381-3f4f-4192-b7de-098fd403ff53\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.073307 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-config-data\") pod \"cb961381-3f4f-4192-b7de-098fd403ff53\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.073348 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnvhz\" (UniqueName: \"kubernetes.io/projected/cb961381-3f4f-4192-b7de-098fd403ff53-kube-api-access-bnvhz\") pod \"cb961381-3f4f-4192-b7de-098fd403ff53\" (UID: \"cb961381-3f4f-4192-b7de-098fd403ff53\") " Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.079231 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb961381-3f4f-4192-b7de-098fd403ff53-kube-api-access-bnvhz" (OuterVolumeSpecName: "kube-api-access-bnvhz") pod "cb961381-3f4f-4192-b7de-098fd403ff53" (UID: "cb961381-3f4f-4192-b7de-098fd403ff53"). InnerVolumeSpecName "kube-api-access-bnvhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.087835 4834 scope.go:117] "RemoveContainer" containerID="7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935" Oct 08 22:45:20 crc kubenswrapper[4834]: E1008 22:45:20.088441 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935\": container with ID starting with 7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935 not found: ID does not exist" containerID="7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.088503 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935"} err="failed to get container status \"7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935\": rpc error: code = NotFound desc = could not find container \"7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935\": container with ID starting with 7837c8d21aa41743a71f58d0fbf49f573ad560563cdbc8f17659ba51538d7935 not found: ID does not exist" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.114383 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-config-data" (OuterVolumeSpecName: "config-data") pod "cb961381-3f4f-4192-b7de-098fd403ff53" (UID: "cb961381-3f4f-4192-b7de-098fd403ff53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.115487 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb961381-3f4f-4192-b7de-098fd403ff53" (UID: "cb961381-3f4f-4192-b7de-098fd403ff53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.176254 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.176286 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnvhz\" (UniqueName: \"kubernetes.io/projected/cb961381-3f4f-4192-b7de-098fd403ff53-kube-api-access-bnvhz\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.176295 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb961381-3f4f-4192-b7de-098fd403ff53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.394396 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.404018 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.421463 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:20 crc kubenswrapper[4834]: E1008 22:45:20.425130 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb961381-3f4f-4192-b7de-098fd403ff53" containerName="nova-scheduler-scheduler" Oct 08 22:45:20 
crc kubenswrapper[4834]: I1008 22:45:20.425178 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb961381-3f4f-4192-b7de-098fd403ff53" containerName="nova-scheduler-scheduler" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.425471 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb961381-3f4f-4192-b7de-098fd403ff53" containerName="nova-scheduler-scheduler" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.426068 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.430912 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.432019 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.582453 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-config-data\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.582607 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.582644 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v95r\" (UniqueName: \"kubernetes.io/projected/c2a0af76-da3c-4c64-a3d2-12470eb473a5-kube-api-access-7v95r\") pod \"nova-scheduler-0\" (UID: 
\"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.684239 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.684333 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v95r\" (UniqueName: \"kubernetes.io/projected/c2a0af76-da3c-4c64-a3d2-12470eb473a5-kube-api-access-7v95r\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.684577 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-config-data\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.707418 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-config-data\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.707681 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.712239 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7v95r\" (UniqueName: \"kubernetes.io/projected/c2a0af76-da3c-4c64-a3d2-12470eb473a5-kube-api-access-7v95r\") pod \"nova-scheduler-0\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") " pod="openstack/nova-scheduler-0" Oct 08 22:45:20 crc kubenswrapper[4834]: I1008 22:45:20.764085 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.076022 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"92213f20-28bf-4fe1-b547-6867677b0049","Type":"ContainerStarted","Data":"16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469"} Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.077886 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.084882 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e088e5ae-16c2-4a56-a140-3159b429ad55","Type":"ContainerStarted","Data":"034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e"} Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.084932 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e088e5ae-16c2-4a56-a140-3159b429ad55","Type":"ContainerStarted","Data":"3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b"} Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.109712 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.10967076 podStartE2EDuration="2.10967076s" podCreationTimestamp="2025-10-08 22:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:21.097285309 +0000 UTC 
m=+1328.920170065" watchObservedRunningTime="2025-10-08 22:45:21.10967076 +0000 UTC m=+1328.932555516" Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.137577 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.137552356 podStartE2EDuration="2.137552356s" podCreationTimestamp="2025-10-08 22:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:21.119765664 +0000 UTC m=+1328.942650410" watchObservedRunningTime="2025-10-08 22:45:21.137552356 +0000 UTC m=+1328.960437122" Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.311513 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:45:21 crc kubenswrapper[4834]: I1008 22:45:21.569723 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb961381-3f4f-4192-b7de-098fd403ff53" path="/var/lib/kubelet/pods/cb961381-3f4f-4192-b7de-098fd403ff53/volumes" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.072487 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.105199 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2a0af76-da3c-4c64-a3d2-12470eb473a5","Type":"ContainerStarted","Data":"91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3"} Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.105654 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2a0af76-da3c-4c64-a3d2-12470eb473a5","Type":"ContainerStarted","Data":"8b2c7d5e0f34b533294f4a29ca31f296e3047ff84ee08a4176cbfba2c3a27dae"} Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.108788 4834 generic.go:334] "Generic (PLEG): container finished" podID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerID="45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7" exitCode=0 Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.108835 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.108893 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14fc6262-b8ce-4cd4-927f-b78590123bd9","Type":"ContainerDied","Data":"45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7"} Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.108921 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"14fc6262-b8ce-4cd4-927f-b78590123bd9","Type":"ContainerDied","Data":"339318a0149ac3b3ffa96febb8fd15ab911f0aa751949fa844dca444e58d7780"} Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.108941 4834 scope.go:117] "RemoveContainer" containerID="45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.129571 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.129545057 podStartE2EDuration="2.129545057s" podCreationTimestamp="2025-10-08 22:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:22.123521862 +0000 UTC m=+1329.946406608" watchObservedRunningTime="2025-10-08 22:45:22.129545057 +0000 UTC m=+1329.952429803" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.156730 4834 scope.go:117] "RemoveContainer" containerID="33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.182537 4834 scope.go:117] "RemoveContainer" containerID="45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7" Oct 08 22:45:22 crc kubenswrapper[4834]: E1008 22:45:22.183081 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7\": 
container with ID starting with 45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7 not found: ID does not exist" containerID="45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.183128 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7"} err="failed to get container status \"45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7\": rpc error: code = NotFound desc = could not find container \"45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7\": container with ID starting with 45c7fca60ecb1628b799cfb660b7c483495d50fb85a855c6659c32c2439ce7c7 not found: ID does not exist" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.183180 4834 scope.go:117] "RemoveContainer" containerID="33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7" Oct 08 22:45:22 crc kubenswrapper[4834]: E1008 22:45:22.183609 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7\": container with ID starting with 33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7 not found: ID does not exist" containerID="33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.183649 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7"} err="failed to get container status \"33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7\": rpc error: code = NotFound desc = could not find container \"33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7\": container with ID starting with 
33bda31da78488e9ed5d034251e043790a436af05147cf8111d934680ab121e7 not found: ID does not exist" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.231229 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-combined-ca-bundle\") pod \"14fc6262-b8ce-4cd4-927f-b78590123bd9\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.231329 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14fc6262-b8ce-4cd4-927f-b78590123bd9-logs\") pod \"14fc6262-b8ce-4cd4-927f-b78590123bd9\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.231368 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-config-data\") pod \"14fc6262-b8ce-4cd4-927f-b78590123bd9\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.231540 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk5xw\" (UniqueName: \"kubernetes.io/projected/14fc6262-b8ce-4cd4-927f-b78590123bd9-kube-api-access-zk5xw\") pod \"14fc6262-b8ce-4cd4-927f-b78590123bd9\" (UID: \"14fc6262-b8ce-4cd4-927f-b78590123bd9\") " Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.236079 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14fc6262-b8ce-4cd4-927f-b78590123bd9-logs" (OuterVolumeSpecName: "logs") pod "14fc6262-b8ce-4cd4-927f-b78590123bd9" (UID: "14fc6262-b8ce-4cd4-927f-b78590123bd9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.242428 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14fc6262-b8ce-4cd4-927f-b78590123bd9-kube-api-access-zk5xw" (OuterVolumeSpecName: "kube-api-access-zk5xw") pod "14fc6262-b8ce-4cd4-927f-b78590123bd9" (UID: "14fc6262-b8ce-4cd4-927f-b78590123bd9"). InnerVolumeSpecName "kube-api-access-zk5xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.271556 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-config-data" (OuterVolumeSpecName: "config-data") pod "14fc6262-b8ce-4cd4-927f-b78590123bd9" (UID: "14fc6262-b8ce-4cd4-927f-b78590123bd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.282776 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14fc6262-b8ce-4cd4-927f-b78590123bd9" (UID: "14fc6262-b8ce-4cd4-927f-b78590123bd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.334497 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.334548 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14fc6262-b8ce-4cd4-927f-b78590123bd9-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.334560 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14fc6262-b8ce-4cd4-927f-b78590123bd9-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.334750 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk5xw\" (UniqueName: \"kubernetes.io/projected/14fc6262-b8ce-4cd4-927f-b78590123bd9-kube-api-access-zk5xw\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.469540 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.479165 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.489400 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:22 crc kubenswrapper[4834]: E1008 22:45:22.489801 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-log" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.489818 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-log" Oct 08 22:45:22 crc kubenswrapper[4834]: E1008 
22:45:22.489842 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-api" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.489849 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-api" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.490078 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-api" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.490095 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" containerName="nova-api-log" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.491227 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.497553 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.503915 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.548752 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzlq\" (UniqueName: \"kubernetes.io/projected/4224e6e1-de11-4651-953c-e37ab41c9442-kube-api-access-7vzlq\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.549057 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 
crc kubenswrapper[4834]: I1008 22:45:22.549129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-config-data\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.549221 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4224e6e1-de11-4651-953c-e37ab41c9442-logs\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.651865 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.651970 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-config-data\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.652007 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4224e6e1-de11-4651-953c-e37ab41c9442-logs\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.652076 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzlq\" (UniqueName: 
\"kubernetes.io/projected/4224e6e1-de11-4651-953c-e37ab41c9442-kube-api-access-7vzlq\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.652857 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4224e6e1-de11-4651-953c-e37ab41c9442-logs\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.658295 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-config-data\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.659951 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.670387 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzlq\" (UniqueName: \"kubernetes.io/projected/4224e6e1-de11-4651-953c-e37ab41c9442-kube-api-access-7vzlq\") pod \"nova-api-0\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " pod="openstack/nova-api-0" Oct 08 22:45:22 crc kubenswrapper[4834]: I1008 22:45:22.810183 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:23 crc kubenswrapper[4834]: I1008 22:45:23.349943 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:23 crc kubenswrapper[4834]: W1008 22:45:23.360192 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4224e6e1_de11_4651_953c_e37ab41c9442.slice/crio-fe29e24e22b8c1e7cc0a9001d1c755186f8c81ad82daf38dd80e284c0481c9d8 WatchSource:0}: Error finding container fe29e24e22b8c1e7cc0a9001d1c755186f8c81ad82daf38dd80e284c0481c9d8: Status 404 returned error can't find the container with id fe29e24e22b8c1e7cc0a9001d1c755186f8c81ad82daf38dd80e284c0481c9d8 Oct 08 22:45:23 crc kubenswrapper[4834]: I1008 22:45:23.571083 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14fc6262-b8ce-4cd4-927f-b78590123bd9" path="/var/lib/kubelet/pods/14fc6262-b8ce-4cd4-927f-b78590123bd9/volumes" Oct 08 22:45:24 crc kubenswrapper[4834]: I1008 22:45:24.132879 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4224e6e1-de11-4651-953c-e37ab41c9442","Type":"ContainerStarted","Data":"27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f"} Oct 08 22:45:24 crc kubenswrapper[4834]: I1008 22:45:24.132926 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4224e6e1-de11-4651-953c-e37ab41c9442","Type":"ContainerStarted","Data":"4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080"} Oct 08 22:45:24 crc kubenswrapper[4834]: I1008 22:45:24.132939 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4224e6e1-de11-4651-953c-e37ab41c9442","Type":"ContainerStarted","Data":"fe29e24e22b8c1e7cc0a9001d1c755186f8c81ad82daf38dd80e284c0481c9d8"} Oct 08 22:45:24 crc kubenswrapper[4834]: I1008 22:45:24.151429 4834 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.15140756 podStartE2EDuration="2.15140756s" podCreationTimestamp="2025-10-08 22:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:24.149017592 +0000 UTC m=+1331.971902348" watchObservedRunningTime="2025-10-08 22:45:24.15140756 +0000 UTC m=+1331.974292316" Oct 08 22:45:24 crc kubenswrapper[4834]: I1008 22:45:24.530437 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:45:24 crc kubenswrapper[4834]: I1008 22:45:24.530852 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:45:25 crc kubenswrapper[4834]: I1008 22:45:25.764253 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 22:45:29 crc kubenswrapper[4834]: I1008 22:45:29.496866 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 08 22:45:29 crc kubenswrapper[4834]: I1008 22:45:29.531260 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:45:29 crc kubenswrapper[4834]: I1008 22:45:29.532588 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:45:30 crc kubenswrapper[4834]: I1008 22:45:30.543281 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:45:30 crc kubenswrapper[4834]: I1008 22:45:30.543366 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:45:30 crc kubenswrapper[4834]: I1008 22:45:30.765194 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 22:45:30 crc kubenswrapper[4834]: I1008 22:45:30.797736 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 22:45:31 crc kubenswrapper[4834]: I1008 22:45:31.135192 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:45:31 crc kubenswrapper[4834]: I1008 22:45:31.268765 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 22:45:32 crc kubenswrapper[4834]: I1008 22:45:32.812315 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:45:32 crc kubenswrapper[4834]: I1008 22:45:32.812696 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:45:33 crc kubenswrapper[4834]: I1008 22:45:33.895454 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:45:33 crc kubenswrapper[4834]: I1008 22:45:33.895489 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 08 22:45:34 crc 
kubenswrapper[4834]: I1008 22:45:34.782513 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:45:34 crc kubenswrapper[4834]: I1008 22:45:34.782743 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="53daad82-5614-475c-b3c0-95329fc7d46a" containerName="kube-state-metrics" containerID="cri-o://ca24648ca7cb1d935733fd8d6a5dae57fa7a4565f69f1abcdbe7a629bc83a777" gracePeriod=30 Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.276183 4834 generic.go:334] "Generic (PLEG): container finished" podID="53daad82-5614-475c-b3c0-95329fc7d46a" containerID="ca24648ca7cb1d935733fd8d6a5dae57fa7a4565f69f1abcdbe7a629bc83a777" exitCode=2 Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.276586 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"53daad82-5614-475c-b3c0-95329fc7d46a","Type":"ContainerDied","Data":"ca24648ca7cb1d935733fd8d6a5dae57fa7a4565f69f1abcdbe7a629bc83a777"} Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.276612 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"53daad82-5614-475c-b3c0-95329fc7d46a","Type":"ContainerDied","Data":"2362b88f1267a90efa20c80f39e46604f59242052ff670c98e29520f066ec3e1"} Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.276625 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2362b88f1267a90efa20c80f39e46604f59242052ff670c98e29520f066ec3e1" Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.337535 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.455246 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2kf6\" (UniqueName: \"kubernetes.io/projected/53daad82-5614-475c-b3c0-95329fc7d46a-kube-api-access-t2kf6\") pod \"53daad82-5614-475c-b3c0-95329fc7d46a\" (UID: \"53daad82-5614-475c-b3c0-95329fc7d46a\") " Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.466367 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53daad82-5614-475c-b3c0-95329fc7d46a-kube-api-access-t2kf6" (OuterVolumeSpecName: "kube-api-access-t2kf6") pod "53daad82-5614-475c-b3c0-95329fc7d46a" (UID: "53daad82-5614-475c-b3c0-95329fc7d46a"). InnerVolumeSpecName "kube-api-access-t2kf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:35 crc kubenswrapper[4834]: I1008 22:45:35.557564 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2kf6\" (UniqueName: \"kubernetes.io/projected/53daad82-5614-475c-b3c0-95329fc7d46a-kube-api-access-t2kf6\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.313264 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.347172 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.364175 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.375846 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:45:36 crc kubenswrapper[4834]: E1008 22:45:36.376361 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53daad82-5614-475c-b3c0-95329fc7d46a" containerName="kube-state-metrics" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.376384 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="53daad82-5614-475c-b3c0-95329fc7d46a" containerName="kube-state-metrics" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.376625 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="53daad82-5614-475c-b3c0-95329fc7d46a" containerName="kube-state-metrics" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.377492 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.379588 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.383763 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.388688 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.576664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.577403 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjghd\" (UniqueName: \"kubernetes.io/projected/8b73c297-7a02-46b4-88bf-30b239655df8-kube-api-access-pjghd\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.577637 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.577745 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.680215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjghd\" (UniqueName: \"kubernetes.io/projected/8b73c297-7a02-46b4-88bf-30b239655df8-kube-api-access-pjghd\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.680404 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.680533 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.684256 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.688529 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.689791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.704459 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.706736 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjghd\" (UniqueName: \"kubernetes.io/projected/8b73c297-7a02-46b4-88bf-30b239655df8-kube-api-access-pjghd\") pod \"kube-state-metrics-0\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " pod="openstack/kube-state-metrics-0" Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.747749 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.748184 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-central-agent" containerID="cri-o://1bc1d76011d94b5ead51f990a8ced182f57e49d4ceefb6a78b1334ed5940b7a9" gracePeriod=30 Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.748250 4834 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="sg-core" containerID="cri-o://d9b4ddc679f848c2a87d22c9b36af537436a290dddee3124cc669e12dc609eaa" gracePeriod=30 Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.748294 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="proxy-httpd" containerID="cri-o://cfc0275a1c70a5dc176a8c69b72aa2ee4c3c1bead2405ddbd71e516b638ac75d" gracePeriod=30 Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.748370 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-notification-agent" containerID="cri-o://90dbfe82b44f65cc0540ee995c758612740536708b5b33999e93e7decd7f1461" gracePeriod=30 Oct 08 22:45:36 crc kubenswrapper[4834]: I1008 22:45:36.997663 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.329850 4834 generic.go:334] "Generic (PLEG): container finished" podID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerID="cfc0275a1c70a5dc176a8c69b72aa2ee4c3c1bead2405ddbd71e516b638ac75d" exitCode=0 Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.330113 4834 generic.go:334] "Generic (PLEG): container finished" podID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerID="d9b4ddc679f848c2a87d22c9b36af537436a290dddee3124cc669e12dc609eaa" exitCode=2 Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.330121 4834 generic.go:334] "Generic (PLEG): container finished" podID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerID="1bc1d76011d94b5ead51f990a8ced182f57e49d4ceefb6a78b1334ed5940b7a9" exitCode=0 Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.329935 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerDied","Data":"cfc0275a1c70a5dc176a8c69b72aa2ee4c3c1bead2405ddbd71e516b638ac75d"} Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.330169 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerDied","Data":"d9b4ddc679f848c2a87d22c9b36af537436a290dddee3124cc669e12dc609eaa"} Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.330182 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerDied","Data":"1bc1d76011d94b5ead51f990a8ced182f57e49d4ceefb6a78b1334ed5940b7a9"} Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.505129 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:45:37 crc kubenswrapper[4834]: W1008 22:45:37.505162 4834 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b73c297_7a02_46b4_88bf_30b239655df8.slice/crio-aefb7fc7aa9622954b805fb5d9c58e984502b890dd2f99f69843b5ea26b5e1ff WatchSource:0}: Error finding container aefb7fc7aa9622954b805fb5d9c58e984502b890dd2f99f69843b5ea26b5e1ff: Status 404 returned error can't find the container with id aefb7fc7aa9622954b805fb5d9c58e984502b890dd2f99f69843b5ea26b5e1ff Oct 08 22:45:37 crc kubenswrapper[4834]: I1008 22:45:37.565666 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53daad82-5614-475c-b3c0-95329fc7d46a" path="/var/lib/kubelet/pods/53daad82-5614-475c-b3c0-95329fc7d46a/volumes" Oct 08 22:45:38 crc kubenswrapper[4834]: I1008 22:45:38.340904 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b73c297-7a02-46b4-88bf-30b239655df8","Type":"ContainerStarted","Data":"e4758aa91899dd39f5a62ab08ecb06485dc5e662e93954ee17513d9d7aaa9349"} Oct 08 22:45:38 crc kubenswrapper[4834]: I1008 22:45:38.341316 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 08 22:45:38 crc kubenswrapper[4834]: I1008 22:45:38.341329 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b73c297-7a02-46b4-88bf-30b239655df8","Type":"ContainerStarted","Data":"aefb7fc7aa9622954b805fb5d9c58e984502b890dd2f99f69843b5ea26b5e1ff"} Oct 08 22:45:38 crc kubenswrapper[4834]: I1008 22:45:38.365657 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.943935543 podStartE2EDuration="2.365635271s" podCreationTimestamp="2025-10-08 22:45:36 +0000 UTC" firstStartedPulling="2025-10-08 22:45:37.50762723 +0000 UTC m=+1345.330511976" lastFinishedPulling="2025-10-08 22:45:37.929326918 +0000 UTC m=+1345.752211704" observedRunningTime="2025-10-08 22:45:38.360358803 +0000 UTC m=+1346.183243629" 
watchObservedRunningTime="2025-10-08 22:45:38.365635271 +0000 UTC m=+1346.188520017" Oct 08 22:45:39 crc kubenswrapper[4834]: I1008 22:45:39.538459 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:45:39 crc kubenswrapper[4834]: I1008 22:45:39.574703 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:45:39 crc kubenswrapper[4834]: I1008 22:45:39.574867 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.369197 4834 generic.go:334] "Generic (PLEG): container finished" podID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerID="90dbfe82b44f65cc0540ee995c758612740536708b5b33999e93e7decd7f1461" exitCode=0 Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.369986 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerDied","Data":"90dbfe82b44f65cc0540ee995c758612740536708b5b33999e93e7decd7f1461"} Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.382489 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.631121 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774557 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-log-httpd\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774615 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-combined-ca-bundle\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774637 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-config-data\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774669 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-scripts\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774716 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-run-httpd\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774764 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bsmk\" (UniqueName: 
\"kubernetes.io/projected/c61bdc81-d45d-44dd-a286-10811d22efa2-kube-api-access-4bsmk\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.774815 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-sg-core-conf-yaml\") pod \"c61bdc81-d45d-44dd-a286-10811d22efa2\" (UID: \"c61bdc81-d45d-44dd-a286-10811d22efa2\") " Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.775071 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.775273 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.775685 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.783073 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61bdc81-d45d-44dd-a286-10811d22efa2-kube-api-access-4bsmk" (OuterVolumeSpecName: "kube-api-access-4bsmk") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "kube-api-access-4bsmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.796423 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-scripts" (OuterVolumeSpecName: "scripts") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.803319 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.853627 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.880501 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.880531 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.880540 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c61bdc81-d45d-44dd-a286-10811d22efa2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.880549 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bsmk\" (UniqueName: \"kubernetes.io/projected/c61bdc81-d45d-44dd-a286-10811d22efa2-kube-api-access-4bsmk\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.880560 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.916436 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-config-data" (OuterVolumeSpecName: "config-data") pod "c61bdc81-d45d-44dd-a286-10811d22efa2" (UID: "c61bdc81-d45d-44dd-a286-10811d22efa2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:40 crc kubenswrapper[4834]: I1008 22:45:40.982831 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61bdc81-d45d-44dd-a286-10811d22efa2-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.380414 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c61bdc81-d45d-44dd-a286-10811d22efa2","Type":"ContainerDied","Data":"cddbc6d8532e3193d1e703a2192ae63d80da3ac7610233514ba01dcc5811e2ec"} Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.380870 4834 scope.go:117] "RemoveContainer" containerID="cfc0275a1c70a5dc176a8c69b72aa2ee4c3c1bead2405ddbd71e516b638ac75d" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.380435 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.411317 4834 scope.go:117] "RemoveContainer" containerID="d9b4ddc679f848c2a87d22c9b36af537436a290dddee3124cc669e12dc609eaa" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.421129 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.429506 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439266 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:41 crc kubenswrapper[4834]: E1008 22:45:41.439663 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-central-agent" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439686 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-central-agent" Oct 
08 22:45:41 crc kubenswrapper[4834]: E1008 22:45:41.439699 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="proxy-httpd" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439705 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="proxy-httpd" Oct 08 22:45:41 crc kubenswrapper[4834]: E1008 22:45:41.439722 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="sg-core" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439729 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="sg-core" Oct 08 22:45:41 crc kubenswrapper[4834]: E1008 22:45:41.439757 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-notification-agent" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439765 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-notification-agent" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439933 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="sg-core" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439944 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="proxy-httpd" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439966 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" containerName="ceilometer-central-agent" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.439976 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" 
containerName="ceilometer-notification-agent" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.441833 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.445810 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.446765 4834 scope.go:117] "RemoveContainer" containerID="90dbfe82b44f65cc0540ee995c758612740536708b5b33999e93e7decd7f1461" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.447109 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.448087 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.470701 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.477605 4834 scope.go:117] "RemoveContainer" containerID="1bc1d76011d94b5ead51f990a8ced182f57e49d4ceefb6a78b1334ed5940b7a9" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.578183 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61bdc81-d45d-44dd-a286-10811d22efa2" path="/var/lib/kubelet/pods/c61bdc81-d45d-44dd-a286-10811d22efa2/volumes" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594303 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594369 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594450 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-scripts\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594488 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-config-data\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594513 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqg2\" (UniqueName: \"kubernetes.io/projected/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-kube-api-access-8fqg2\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594540 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594575 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.594591 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.696349 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.696684 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.696847 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-scripts\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.696929 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-config-data\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 
22:45:41.697063 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqg2\" (UniqueName: \"kubernetes.io/projected/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-kube-api-access-8fqg2\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.697126 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.697271 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.697304 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.698735 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.699930 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-log-httpd\") pod \"ceilometer-0\" (UID: 
\"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.704753 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-scripts\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.706261 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-config-data\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.708603 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.711651 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.715060 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.719287 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqg2\" (UniqueName: 
\"kubernetes.io/projected/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-kube-api-access-8fqg2\") pod \"ceilometer-0\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " pod="openstack/ceilometer-0" Oct 08 22:45:41 crc kubenswrapper[4834]: I1008 22:45:41.765889 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.269741 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:42 crc kubenswrapper[4834]: W1008 22:45:42.273438 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4fb8a5_4888_4325_b4a8_1bda8b277ee8.slice/crio-8fd7248fb0f70036981156635e7361d5521098bdd4712d98b329034f06d13b93 WatchSource:0}: Error finding container 8fd7248fb0f70036981156635e7361d5521098bdd4712d98b329034f06d13b93: Status 404 returned error can't find the container with id 8fd7248fb0f70036981156635e7361d5521098bdd4712d98b329034f06d13b93 Oct 08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.276010 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.396223 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerStarted","Data":"8fd7248fb0f70036981156635e7361d5521098bdd4712d98b329034f06d13b93"} Oct 08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.815594 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.817439 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.820731 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 
08 22:45:42 crc kubenswrapper[4834]: I1008 22:45:42.821793 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.253667 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.327954 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-config-data\") pod \"b7fe0249-e25b-4912-b569-d8b16d8da682\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.330060 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-combined-ca-bundle\") pod \"b7fe0249-e25b-4912-b569-d8b16d8da682\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.330169 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq4r5\" (UniqueName: \"kubernetes.io/projected/b7fe0249-e25b-4912-b569-d8b16d8da682-kube-api-access-dq4r5\") pod \"b7fe0249-e25b-4912-b569-d8b16d8da682\" (UID: \"b7fe0249-e25b-4912-b569-d8b16d8da682\") " Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.335933 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7fe0249-e25b-4912-b569-d8b16d8da682-kube-api-access-dq4r5" (OuterVolumeSpecName: "kube-api-access-dq4r5") pod "b7fe0249-e25b-4912-b569-d8b16d8da682" (UID: "b7fe0249-e25b-4912-b569-d8b16d8da682"). InnerVolumeSpecName "kube-api-access-dq4r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.364385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7fe0249-e25b-4912-b569-d8b16d8da682" (UID: "b7fe0249-e25b-4912-b569-d8b16d8da682"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.382021 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-config-data" (OuterVolumeSpecName: "config-data") pod "b7fe0249-e25b-4912-b569-d8b16d8da682" (UID: "b7fe0249-e25b-4912-b569-d8b16d8da682"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.411941 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerStarted","Data":"ec67f4214f71facacfbfe0e55777b879349f7ab3bb68eeac356f8518175230a6"} Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.416261 4834 generic.go:334] "Generic (PLEG): container finished" podID="b7fe0249-e25b-4912-b569-d8b16d8da682" containerID="50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064" exitCode=137 Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.417194 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7fe0249-e25b-4912-b569-d8b16d8da682","Type":"ContainerDied","Data":"50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064"} Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.417230 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b7fe0249-e25b-4912-b569-d8b16d8da682","Type":"ContainerDied","Data":"b5b3bc723da654909f550545019cb5a28d1a085d15135bed45d0fad921824391"} Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.417205 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.417419 4834 scope.go:117] "RemoveContainer" containerID="50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.417450 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.420718 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.434458 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.434698 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq4r5\" (UniqueName: \"kubernetes.io/projected/b7fe0249-e25b-4912-b569-d8b16d8da682-kube-api-access-dq4r5\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.434781 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7fe0249-e25b-4912-b569-d8b16d8da682-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.470190 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.476099 4834 scope.go:117] "RemoveContainer" containerID="50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064" Oct 08 
22:45:43 crc kubenswrapper[4834]: E1008 22:45:43.479601 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064\": container with ID starting with 50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064 not found: ID does not exist" containerID="50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.479655 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064"} err="failed to get container status \"50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064\": rpc error: code = NotFound desc = could not find container \"50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064\": container with ID starting with 50165228f76efb3fa92eaca441d86781461722ecf9c41845ad36a3b89db2c064 not found: ID does not exist" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.481222 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.501125 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:43 crc kubenswrapper[4834]: E1008 22:45:43.502594 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fe0249-e25b-4912-b569-d8b16d8da682" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.502682 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fe0249-e25b-4912-b569-d8b16d8da682" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.502994 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fe0249-e25b-4912-b569-d8b16d8da682" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.503891 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.509007 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.509391 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.509432 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.516342 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.618905 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7fe0249-e25b-4912-b569-d8b16d8da682" path="/var/lib/kubelet/pods/b7fe0249-e25b-4912-b569-d8b16d8da682/volumes" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.650978 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hpfd\" (UniqueName: \"kubernetes.io/projected/caf766d3-49fe-4a20-bf0e-405ccca15c69-kube-api-access-8hpfd\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.651111 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc 
kubenswrapper[4834]: I1008 22:45:43.651190 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.651220 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.651243 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.652554 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5967cc9597-s2zsg"] Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.654286 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.667644 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5967cc9597-s2zsg"] Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753682 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hpfd\" (UniqueName: \"kubernetes.io/projected/caf766d3-49fe-4a20-bf0e-405ccca15c69-kube-api-access-8hpfd\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753739 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-config\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753776 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753841 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h299\" (UniqueName: \"kubernetes.io/projected/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-kube-api-access-4h299\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753886 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-swift-storage-0\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753916 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.753974 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-svc\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.754700 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.754786 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.754820 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.754867 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.759884 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.760278 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.765764 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.768642 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.770972 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hpfd\" (UniqueName: \"kubernetes.io/projected/caf766d3-49fe-4a20-bf0e-405ccca15c69-kube-api-access-8hpfd\") pod \"nova-cell1-novncproxy-0\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") " pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.856251 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-config\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.856312 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.856372 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h299\" (UniqueName: \"kubernetes.io/projected/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-kube-api-access-4h299\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.856403 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-swift-storage-0\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: 
\"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.856455 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-svc\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.856482 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.857269 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-nb\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.857343 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-sb\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.857680 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-svc\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 
crc kubenswrapper[4834]: I1008 22:45:43.857938 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-config\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.859766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-swift-storage-0\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.873793 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h299\" (UniqueName: \"kubernetes.io/projected/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-kube-api-access-4h299\") pod \"dnsmasq-dns-5967cc9597-s2zsg\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:43 crc kubenswrapper[4834]: I1008 22:45:43.877667 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:44 crc kubenswrapper[4834]: I1008 22:45:43.999432 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:44 crc kubenswrapper[4834]: I1008 22:45:44.372600 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:45:44 crc kubenswrapper[4834]: W1008 22:45:44.381022 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaf766d3_49fe_4a20_bf0e_405ccca15c69.slice/crio-0b21c10d816c2c7153b5c45b8db39059b03941a188f9cac22497427fa550ca0b WatchSource:0}: Error finding container 0b21c10d816c2c7153b5c45b8db39059b03941a188f9cac22497427fa550ca0b: Status 404 returned error can't find the container with id 0b21c10d816c2c7153b5c45b8db39059b03941a188f9cac22497427fa550ca0b Oct 08 22:45:44 crc kubenswrapper[4834]: I1008 22:45:44.438917 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"caf766d3-49fe-4a20-bf0e-405ccca15c69","Type":"ContainerStarted","Data":"0b21c10d816c2c7153b5c45b8db39059b03941a188f9cac22497427fa550ca0b"} Oct 08 22:45:44 crc kubenswrapper[4834]: I1008 22:45:44.449497 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerStarted","Data":"0ab3a7d115fb59330d073a21f51cad508e800c73d9b77f143cdf5eb31578fb21"} Oct 08 22:45:44 crc kubenswrapper[4834]: I1008 22:45:44.535029 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5967cc9597-s2zsg"] Oct 08 22:45:44 crc kubenswrapper[4834]: W1008 22:45:44.549111 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod596135ed_4d76_4dec_94bd_cf17dfbfe2d6.slice/crio-ab715cc62a51d5f54bcf374992049fea0483cd825e3d752f6638b2a06f69ed76 WatchSource:0}: Error finding container ab715cc62a51d5f54bcf374992049fea0483cd825e3d752f6638b2a06f69ed76: Status 404 returned error can't find 
the container with id ab715cc62a51d5f54bcf374992049fea0483cd825e3d752f6638b2a06f69ed76 Oct 08 22:45:45 crc kubenswrapper[4834]: I1008 22:45:45.464401 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"caf766d3-49fe-4a20-bf0e-405ccca15c69","Type":"ContainerStarted","Data":"fd7251bd930a8b2a1fa53071747b9a1c8fa4231c37dca7643bd1503481f77ec1"} Oct 08 22:45:45 crc kubenswrapper[4834]: I1008 22:45:45.469188 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerStarted","Data":"dca6081c033626f7391c604360d982e6780577715c5e5e2af187cfa48761fdac"} Oct 08 22:45:45 crc kubenswrapper[4834]: I1008 22:45:45.470835 4834 generic.go:334] "Generic (PLEG): container finished" podID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerID="6b374ba9360aed2a69e514af5204d8b7f9a4d40ce6ebd3eac7796f0331f0d312" exitCode=0 Oct 08 22:45:45 crc kubenswrapper[4834]: I1008 22:45:45.471008 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" event={"ID":"596135ed-4d76-4dec-94bd-cf17dfbfe2d6","Type":"ContainerDied","Data":"6b374ba9360aed2a69e514af5204d8b7f9a4d40ce6ebd3eac7796f0331f0d312"} Oct 08 22:45:45 crc kubenswrapper[4834]: I1008 22:45:45.471040 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" event={"ID":"596135ed-4d76-4dec-94bd-cf17dfbfe2d6","Type":"ContainerStarted","Data":"ab715cc62a51d5f54bcf374992049fea0483cd825e3d752f6638b2a06f69ed76"} Oct 08 22:45:45 crc kubenswrapper[4834]: I1008 22:45:45.496474 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.496451857 podStartE2EDuration="2.496451857s" podCreationTimestamp="2025-10-08 22:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-08 22:45:45.491528538 +0000 UTC m=+1353.314413294" watchObservedRunningTime="2025-10-08 22:45:45.496451857 +0000 UTC m=+1353.319336603" Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.273166 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.465733 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.483998 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" event={"ID":"596135ed-4d76-4dec-94bd-cf17dfbfe2d6","Type":"ContainerStarted","Data":"50d402217bd3c2796ec12c38a36a0c886b0ae574eb8874d5e4be8b40f8de8693"} Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.484163 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.488031 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerStarted","Data":"5dc9f577df840ccc374e9cf2a267ba3aa1f6db6918603ddc6f2f7f25f50caad4"} Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.488262 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-log" containerID="cri-o://4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080" gracePeriod=30 Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.488288 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.488323 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-api" 
containerID="cri-o://27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f" gracePeriod=30 Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.501014 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" podStartSLOduration=3.500994724 podStartE2EDuration="3.500994724s" podCreationTimestamp="2025-10-08 22:45:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:46.499539149 +0000 UTC m=+1354.322423905" watchObservedRunningTime="2025-10-08 22:45:46.500994724 +0000 UTC m=+1354.323879470" Oct 08 22:45:46 crc kubenswrapper[4834]: I1008 22:45:46.546184 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9676384580000001 podStartE2EDuration="5.546135979s" podCreationTimestamp="2025-10-08 22:45:41 +0000 UTC" firstStartedPulling="2025-10-08 22:45:42.275822877 +0000 UTC m=+1350.098707623" lastFinishedPulling="2025-10-08 22:45:45.854320398 +0000 UTC m=+1353.677205144" observedRunningTime="2025-10-08 22:45:46.538961775 +0000 UTC m=+1354.361846521" watchObservedRunningTime="2025-10-08 22:45:46.546135979 +0000 UTC m=+1354.369020725" Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.012104 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.025043 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.025097 4834 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.504788 4834 generic.go:334] "Generic (PLEG): container finished" podID="4224e6e1-de11-4651-953c-e37ab41c9442" containerID="4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080" exitCode=143 Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.504852 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4224e6e1-de11-4651-953c-e37ab41c9442","Type":"ContainerDied","Data":"4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080"} Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.506005 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-central-agent" containerID="cri-o://ec67f4214f71facacfbfe0e55777b879349f7ab3bb68eeac356f8518175230a6" gracePeriod=30 Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.506064 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="proxy-httpd" containerID="cri-o://5dc9f577df840ccc374e9cf2a267ba3aa1f6db6918603ddc6f2f7f25f50caad4" gracePeriod=30 Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.506129 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="sg-core" containerID="cri-o://dca6081c033626f7391c604360d982e6780577715c5e5e2af187cfa48761fdac" gracePeriod=30 Oct 08 22:45:47 crc kubenswrapper[4834]: I1008 22:45:47.506223 4834 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-notification-agent" containerID="cri-o://0ab3a7d115fb59330d073a21f51cad508e800c73d9b77f143cdf5eb31578fb21" gracePeriod=30 Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.532820 4834 generic.go:334] "Generic (PLEG): container finished" podID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerID="5dc9f577df840ccc374e9cf2a267ba3aa1f6db6918603ddc6f2f7f25f50caad4" exitCode=0 Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.533174 4834 generic.go:334] "Generic (PLEG): container finished" podID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerID="dca6081c033626f7391c604360d982e6780577715c5e5e2af187cfa48761fdac" exitCode=2 Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.532913 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerDied","Data":"5dc9f577df840ccc374e9cf2a267ba3aa1f6db6918603ddc6f2f7f25f50caad4"} Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.533231 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerDied","Data":"dca6081c033626f7391c604360d982e6780577715c5e5e2af187cfa48761fdac"} Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.533258 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerDied","Data":"0ab3a7d115fb59330d073a21f51cad508e800c73d9b77f143cdf5eb31578fb21"} Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.533189 4834 generic.go:334] "Generic (PLEG): container finished" podID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerID="0ab3a7d115fb59330d073a21f51cad508e800c73d9b77f143cdf5eb31578fb21" exitCode=0 Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.533282 4834 generic.go:334] "Generic (PLEG): container 
finished" podID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerID="ec67f4214f71facacfbfe0e55777b879349f7ab3bb68eeac356f8518175230a6" exitCode=0 Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.533301 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerDied","Data":"ec67f4214f71facacfbfe0e55777b879349f7ab3bb68eeac356f8518175230a6"} Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.742250 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.872707 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-scripts\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.872812 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-combined-ca-bundle\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.872877 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-config-data\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.872948 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fqg2\" (UniqueName: \"kubernetes.io/projected/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-kube-api-access-8fqg2\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: 
\"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.872996 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-log-httpd\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.873018 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-sg-core-conf-yaml\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.873064 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-run-httpd\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.873099 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-ceilometer-tls-certs\") pod \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\" (UID: \"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8\") " Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.874057 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.874305 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.878071 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.879659 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-kube-api-access-8fqg2" (OuterVolumeSpecName: "kube-api-access-8fqg2") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "kube-api-access-8fqg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.882356 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-scripts" (OuterVolumeSpecName: "scripts") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.910960 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.933353 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.957530 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977058 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fqg2\" (UniqueName: \"kubernetes.io/projected/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-kube-api-access-8fqg2\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977092 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977104 4834 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977116 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-run-httpd\") on node 
\"crc\" DevicePath \"\"" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977127 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977139 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:48 crc kubenswrapper[4834]: I1008 22:45:48.977214 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.008960 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-config-data" (OuterVolumeSpecName: "config-data") pod "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" (UID: "9b4fb8a5-4888-4325-b4a8-1bda8b277ee8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.079711 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.551846 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4fb8a5-4888-4325-b4a8-1bda8b277ee8","Type":"ContainerDied","Data":"8fd7248fb0f70036981156635e7361d5521098bdd4712d98b329034f06d13b93"} Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.551936 4834 scope.go:117] "RemoveContainer" containerID="5dc9f577df840ccc374e9cf2a267ba3aa1f6db6918603ddc6f2f7f25f50caad4" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.551932 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.606338 4834 scope.go:117] "RemoveContainer" containerID="dca6081c033626f7391c604360d982e6780577715c5e5e2af187cfa48761fdac" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.613539 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.624462 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.674396 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:49 crc kubenswrapper[4834]: E1008 22:45:49.674973 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-central-agent" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.674990 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-central-agent" Oct 
08 22:45:49 crc kubenswrapper[4834]: E1008 22:45:49.675007 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-notification-agent" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675014 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-notification-agent" Oct 08 22:45:49 crc kubenswrapper[4834]: E1008 22:45:49.675047 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="sg-core" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675055 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="sg-core" Oct 08 22:45:49 crc kubenswrapper[4834]: E1008 22:45:49.675082 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="proxy-httpd" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675088 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="proxy-httpd" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675419 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-notification-agent" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675441 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="ceilometer-central-agent" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675459 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" containerName="proxy-httpd" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.675482 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" 
containerName="sg-core" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.676780 4834 scope.go:117] "RemoveContainer" containerID="0ab3a7d115fb59330d073a21f51cad508e800c73d9b77f143cdf5eb31578fb21" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.678079 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.680387 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.681073 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.681280 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.698407 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.792511 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-run-httpd\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.792854 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.792891 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-log-httpd\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.792950 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.792981 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.793044 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-config-data\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.793081 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwmv\" (UniqueName: \"kubernetes.io/projected/ed2e3be8-465e-4b20-9586-387cd8d9ca67-kube-api-access-cfwmv\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.793131 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-scripts\") pod \"ceilometer-0\" (UID: 
\"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.799868 4834 scope.go:117] "RemoveContainer" containerID="ec67f4214f71facacfbfe0e55777b879349f7ab3bb68eeac356f8518175230a6" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895062 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895174 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-config-data\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895205 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfwmv\" (UniqueName: \"kubernetes.io/projected/ed2e3be8-465e-4b20-9586-387cd8d9ca67-kube-api-access-cfwmv\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895241 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-scripts\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895273 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-run-httpd\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " 
pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895333 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895370 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-log-httpd\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.895442 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.902781 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-scripts\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.902953 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.903632 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-run-httpd\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.904943 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.905270 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-log-httpd\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.907732 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-config-data\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.909791 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:49 crc kubenswrapper[4834]: I1008 22:45:49.919837 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwmv\" (UniqueName: \"kubernetes.io/projected/ed2e3be8-465e-4b20-9586-387cd8d9ca67-kube-api-access-cfwmv\") pod \"ceilometer-0\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " pod="openstack/ceilometer-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.040072 4834 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.098781 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.209862 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-config-data\") pod \"4224e6e1-de11-4651-953c-e37ab41c9442\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.210389 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4224e6e1-de11-4651-953c-e37ab41c9442-logs\") pod \"4224e6e1-de11-4651-953c-e37ab41c9442\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.210473 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vzlq\" (UniqueName: \"kubernetes.io/projected/4224e6e1-de11-4651-953c-e37ab41c9442-kube-api-access-7vzlq\") pod \"4224e6e1-de11-4651-953c-e37ab41c9442\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.210531 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-combined-ca-bundle\") pod \"4224e6e1-de11-4651-953c-e37ab41c9442\" (UID: \"4224e6e1-de11-4651-953c-e37ab41c9442\") " Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.212695 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4224e6e1-de11-4651-953c-e37ab41c9442-logs" (OuterVolumeSpecName: "logs") pod "4224e6e1-de11-4651-953c-e37ab41c9442" (UID: 
"4224e6e1-de11-4651-953c-e37ab41c9442"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.224369 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4224e6e1-de11-4651-953c-e37ab41c9442-kube-api-access-7vzlq" (OuterVolumeSpecName: "kube-api-access-7vzlq") pod "4224e6e1-de11-4651-953c-e37ab41c9442" (UID: "4224e6e1-de11-4651-953c-e37ab41c9442"). InnerVolumeSpecName "kube-api-access-7vzlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.252814 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-config-data" (OuterVolumeSpecName: "config-data") pod "4224e6e1-de11-4651-953c-e37ab41c9442" (UID: "4224e6e1-de11-4651-953c-e37ab41c9442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.261978 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4224e6e1-de11-4651-953c-e37ab41c9442" (UID: "4224e6e1-de11-4651-953c-e37ab41c9442"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.314750 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vzlq\" (UniqueName: \"kubernetes.io/projected/4224e6e1-de11-4651-953c-e37ab41c9442-kube-api-access-7vzlq\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.314787 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.314799 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4224e6e1-de11-4651-953c-e37ab41c9442-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.314810 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4224e6e1-de11-4651-953c-e37ab41c9442-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.563422 4834 generic.go:334] "Generic (PLEG): container finished" podID="4224e6e1-de11-4651-953c-e37ab41c9442" containerID="27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f" exitCode=0 Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.563478 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4224e6e1-de11-4651-953c-e37ab41c9442","Type":"ContainerDied","Data":"27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f"} Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.563501 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4224e6e1-de11-4651-953c-e37ab41c9442","Type":"ContainerDied","Data":"fe29e24e22b8c1e7cc0a9001d1c755186f8c81ad82daf38dd80e284c0481c9d8"} Oct 08 22:45:50 crc kubenswrapper[4834]: 
I1008 22:45:50.563517 4834 scope.go:117] "RemoveContainer" containerID="27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.563599 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.605370 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.607019 4834 scope.go:117] "RemoveContainer" containerID="4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.614384 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.629738 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:50 crc kubenswrapper[4834]: E1008 22:45:50.630284 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-api" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.630312 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-api" Oct 08 22:45:50 crc kubenswrapper[4834]: E1008 22:45:50.630335 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-log" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.630342 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-log" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.630531 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-api" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.630548 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" containerName="nova-api-log" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.631729 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.633689 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.635360 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.635523 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.654103 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.657765 4834 scope.go:117] "RemoveContainer" containerID="27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f" Oct 08 22:45:50 crc kubenswrapper[4834]: E1008 22:45:50.661254 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f\": container with ID starting with 27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f not found: ID does not exist" containerID="27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.661291 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f"} err="failed to get container status \"27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f\": rpc error: code = NotFound desc = could not find container 
\"27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f\": container with ID starting with 27a388fba3857a974bd3f0eb5db4d9be80d649e0998174771562415a083fe77f not found: ID does not exist" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.661313 4834 scope.go:117] "RemoveContainer" containerID="4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080" Oct 08 22:45:50 crc kubenswrapper[4834]: E1008 22:45:50.663281 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080\": container with ID starting with 4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080 not found: ID does not exist" containerID="4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.663309 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080"} err="failed to get container status \"4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080\": rpc error: code = NotFound desc = could not find container \"4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080\": container with ID starting with 4f3b77f524f507879549841218040a4bba1986e7529b83b977a602509a87e080 not found: ID does not exist" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.689860 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:45:50 crc kubenswrapper[4834]: W1008 22:45:50.692345 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded2e3be8_465e_4b20_9586_387cd8d9ca67.slice/crio-b367283040987c5997b885caf9fbb3a88c0c67915cda01342f2bf40fc24e3a58 WatchSource:0}: Error finding container b367283040987c5997b885caf9fbb3a88c0c67915cda01342f2bf40fc24e3a58: 
Status 404 returned error can't find the container with id b367283040987c5997b885caf9fbb3a88c0c67915cda01342f2bf40fc24e3a58 Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.827316 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-config-data\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.827374 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6806419-9992-4f1f-9bbc-36a0d52340ca-logs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.827400 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzz4h\" (UniqueName: \"kubernetes.io/projected/e6806419-9992-4f1f-9bbc-36a0d52340ca-kube-api-access-gzz4h\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.827433 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.827535 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc 
kubenswrapper[4834]: I1008 22:45:50.827586 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-public-tls-certs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929088 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-config-data\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929137 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6806419-9992-4f1f-9bbc-36a0d52340ca-logs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929181 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzz4h\" (UniqueName: \"kubernetes.io/projected/e6806419-9992-4f1f-9bbc-36a0d52340ca-kube-api-access-gzz4h\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929215 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929283 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929334 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-public-tls-certs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.929982 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6806419-9992-4f1f-9bbc-36a0d52340ca-logs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.935113 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.935263 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.935994 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-config-data\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.948620 4834 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-public-tls-certs\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:50 crc kubenswrapper[4834]: I1008 22:45:50.951346 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzz4h\" (UniqueName: \"kubernetes.io/projected/e6806419-9992-4f1f-9bbc-36a0d52340ca-kube-api-access-gzz4h\") pod \"nova-api-0\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " pod="openstack/nova-api-0" Oct 08 22:45:51 crc kubenswrapper[4834]: I1008 22:45:51.058943 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:45:51 crc kubenswrapper[4834]: W1008 22:45:51.548930 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6806419_9992_4f1f_9bbc_36a0d52340ca.slice/crio-d608cc19d0d0fbfdd7408daa22ba79c5f50ab3be551d097b2fa89a4d43300ba2 WatchSource:0}: Error finding container d608cc19d0d0fbfdd7408daa22ba79c5f50ab3be551d097b2fa89a4d43300ba2: Status 404 returned error can't find the container with id d608cc19d0d0fbfdd7408daa22ba79c5f50ab3be551d097b2fa89a4d43300ba2 Oct 08 22:45:51 crc kubenswrapper[4834]: I1008 22:45:51.551385 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:45:51 crc kubenswrapper[4834]: I1008 22:45:51.604561 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4224e6e1-de11-4651-953c-e37ab41c9442" path="/var/lib/kubelet/pods/4224e6e1-de11-4651-953c-e37ab41c9442/volumes" Oct 08 22:45:51 crc kubenswrapper[4834]: I1008 22:45:51.605443 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4fb8a5-4888-4325-b4a8-1bda8b277ee8" path="/var/lib/kubelet/pods/9b4fb8a5-4888-4325-b4a8-1bda8b277ee8/volumes" Oct 08 22:45:51 crc 
kubenswrapper[4834]: I1008 22:45:51.606269 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6806419-9992-4f1f-9bbc-36a0d52340ca","Type":"ContainerStarted","Data":"d608cc19d0d0fbfdd7408daa22ba79c5f50ab3be551d097b2fa89a4d43300ba2"} Oct 08 22:45:51 crc kubenswrapper[4834]: I1008 22:45:51.606299 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerStarted","Data":"647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568"} Oct 08 22:45:51 crc kubenswrapper[4834]: I1008 22:45:51.606311 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerStarted","Data":"b367283040987c5997b885caf9fbb3a88c0c67915cda01342f2bf40fc24e3a58"} Oct 08 22:45:52 crc kubenswrapper[4834]: I1008 22:45:52.618774 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerStarted","Data":"cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66"} Oct 08 22:45:52 crc kubenswrapper[4834]: I1008 22:45:52.622705 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6806419-9992-4f1f-9bbc-36a0d52340ca","Type":"ContainerStarted","Data":"1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d"} Oct 08 22:45:52 crc kubenswrapper[4834]: I1008 22:45:52.622743 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6806419-9992-4f1f-9bbc-36a0d52340ca","Type":"ContainerStarted","Data":"ffff8b102cbea74492faff4cfca190c1120b7bf0f34305db2f2ebb11b62d6d87"} Oct 08 22:45:52 crc kubenswrapper[4834]: I1008 22:45:52.651292 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.651264854 podStartE2EDuration="2.651264854s" 
podCreationTimestamp="2025-10-08 22:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:52.64282381 +0000 UTC m=+1360.465708566" watchObservedRunningTime="2025-10-08 22:45:52.651264854 +0000 UTC m=+1360.474149630"
Oct 08 22:45:53 crc kubenswrapper[4834]: I1008 22:45:53.646305 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerStarted","Data":"6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2"}
Oct 08 22:45:53 crc kubenswrapper[4834]: I1008 22:45:53.877949 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 22:45:53 crc kubenswrapper[4834]: I1008 22:45:53.923921 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.001121 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.083276 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64d8d96789-wxgvg"]
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.083521 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerName="dnsmasq-dns" containerID="cri-o://17651b84fca04f45f755cc82539e7ab66b0bf27627a435857700c2da43a23544" gracePeriod=10
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.664047 4834 generic.go:334] "Generic (PLEG): container finished" podID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerID="17651b84fca04f45f755cc82539e7ab66b0bf27627a435857700c2da43a23544" exitCode=0
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.665791 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" event={"ID":"7cbed812-c266-46ad-9b64-ccb83e6efb76","Type":"ContainerDied","Data":"17651b84fca04f45f755cc82539e7ab66b0bf27627a435857700c2da43a23544"}
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.665821 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg" event={"ID":"7cbed812-c266-46ad-9b64-ccb83e6efb76","Type":"ContainerDied","Data":"388385b0825a99d1b0ca0e34f609dadc874f098b3adf29cc150b6478018abd70"}
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.665831 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388385b0825a99d1b0ca0e34f609dadc874f098b3adf29cc150b6478018abd70"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.682747 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.738369 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.858818 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6v4fx"]
Oct 08 22:45:54 crc kubenswrapper[4834]: E1008 22:45:54.859516 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerName="dnsmasq-dns"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.859533 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerName="dnsmasq-dns"
Oct 08 22:45:54 crc kubenswrapper[4834]: E1008 22:45:54.859565 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerName="init"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.859571 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerName="init"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.859923 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" containerName="dnsmasq-dns"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.863695 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.868392 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.868460 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.881810 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6v4fx"]
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.918765 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-svc\") pod \"7cbed812-c266-46ad-9b64-ccb83e6efb76\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") "
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.918856 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-sb\") pod \"7cbed812-c266-46ad-9b64-ccb83e6efb76\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") "
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.918961 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-nb\") pod \"7cbed812-c266-46ad-9b64-ccb83e6efb76\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") "
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.918984 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-swift-storage-0\") pod \"7cbed812-c266-46ad-9b64-ccb83e6efb76\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") "
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.919057 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-config\") pod \"7cbed812-c266-46ad-9b64-ccb83e6efb76\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") "
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.919095 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmbkl\" (UniqueName: \"kubernetes.io/projected/7cbed812-c266-46ad-9b64-ccb83e6efb76-kube-api-access-vmbkl\") pod \"7cbed812-c266-46ad-9b64-ccb83e6efb76\" (UID: \"7cbed812-c266-46ad-9b64-ccb83e6efb76\") "
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.944612 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cbed812-c266-46ad-9b64-ccb83e6efb76-kube-api-access-vmbkl" (OuterVolumeSpecName: "kube-api-access-vmbkl") pod "7cbed812-c266-46ad-9b64-ccb83e6efb76" (UID: "7cbed812-c266-46ad-9b64-ccb83e6efb76"). InnerVolumeSpecName "kube-api-access-vmbkl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.968197 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-config" (OuterVolumeSpecName: "config") pod "7cbed812-c266-46ad-9b64-ccb83e6efb76" (UID: "7cbed812-c266-46ad-9b64-ccb83e6efb76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.970788 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cbed812-c266-46ad-9b64-ccb83e6efb76" (UID: "7cbed812-c266-46ad-9b64-ccb83e6efb76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.971708 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cbed812-c266-46ad-9b64-ccb83e6efb76" (UID: "7cbed812-c266-46ad-9b64-ccb83e6efb76"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.972000 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cbed812-c266-46ad-9b64-ccb83e6efb76" (UID: "7cbed812-c266-46ad-9b64-ccb83e6efb76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:45:54 crc kubenswrapper[4834]: I1008 22:45:54.990009 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7cbed812-c266-46ad-9b64-ccb83e6efb76" (UID: "7cbed812-c266-46ad-9b64-ccb83e6efb76"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021215 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021352 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-config-data\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021382 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-scripts\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021415 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2nb\" (UniqueName: \"kubernetes.io/projected/de233f0c-46d3-491e-83e6-b2334ba4ebd5-kube-api-access-kw2nb\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021523 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmbkl\" (UniqueName: \"kubernetes.io/projected/7cbed812-c266-46ad-9b64-ccb83e6efb76-kube-api-access-vmbkl\") on node \"crc\" DevicePath \"\""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021538 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021550 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021563 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021576 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.021586 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cbed812-c266-46ad-9b64-ccb83e6efb76-config\") on node \"crc\" DevicePath \"\""
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.123092 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-config-data\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.123568 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-scripts\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.123613 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw2nb\" (UniqueName: \"kubernetes.io/projected/de233f0c-46d3-491e-83e6-b2334ba4ebd5-kube-api-access-kw2nb\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.123737 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.127773 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.129494 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-scripts\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.129861 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-config-data\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.139647 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw2nb\" (UniqueName: \"kubernetes.io/projected/de233f0c-46d3-491e-83e6-b2334ba4ebd5-kube-api-access-kw2nb\") pod \"nova-cell1-cell-mapping-6v4fx\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") " pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.199686 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.684399 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d8d96789-wxgvg"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.687327 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerStarted","Data":"1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15"}
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.687515 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.728810 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.925669966 podStartE2EDuration="6.728789711s" podCreationTimestamp="2025-10-08 22:45:49 +0000 UTC" firstStartedPulling="2025-10-08 22:45:50.697542805 +0000 UTC m=+1358.520427551" lastFinishedPulling="2025-10-08 22:45:54.50066255 +0000 UTC m=+1362.323547296" observedRunningTime="2025-10-08 22:45:55.721781251 +0000 UTC m=+1363.544665997" watchObservedRunningTime="2025-10-08 22:45:55.728789711 +0000 UTC m=+1363.551674457"
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.735496 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6v4fx"]
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.770672 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64d8d96789-wxgvg"]
Oct 08 22:45:55 crc kubenswrapper[4834]: I1008 22:45:55.780059 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64d8d96789-wxgvg"]
Oct 08 22:45:56 crc kubenswrapper[4834]: I1008 22:45:56.700108 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6v4fx" event={"ID":"de233f0c-46d3-491e-83e6-b2334ba4ebd5","Type":"ContainerStarted","Data":"d772ec43c319ed37efddc72f55b2cad49aa8ee5fa72993e5c58323b2ce4d6343"}
Oct 08 22:45:56 crc kubenswrapper[4834]: I1008 22:45:56.700770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6v4fx" event={"ID":"de233f0c-46d3-491e-83e6-b2334ba4ebd5","Type":"ContainerStarted","Data":"45269a2a42b75fd6148948b13b7ac455a534c09e6960f0b919295836f9320b00"}
Oct 08 22:45:56 crc kubenswrapper[4834]: I1008 22:45:56.734232 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6v4fx" podStartSLOduration=2.734215871 podStartE2EDuration="2.734215871s" podCreationTimestamp="2025-10-08 22:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:45:56.728399339 +0000 UTC m=+1364.551284095" watchObservedRunningTime="2025-10-08 22:45:56.734215871 +0000 UTC m=+1364.557100617"
Oct 08 22:45:57 crc kubenswrapper[4834]: I1008 22:45:57.571294 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cbed812-c266-46ad-9b64-ccb83e6efb76" path="/var/lib/kubelet/pods/7cbed812-c266-46ad-9b64-ccb83e6efb76/volumes"
Oct 08 22:46:00 crc kubenswrapper[4834]: I1008 22:46:00.771797 4834 generic.go:334] "Generic (PLEG): container finished" podID="de233f0c-46d3-491e-83e6-b2334ba4ebd5" containerID="d772ec43c319ed37efddc72f55b2cad49aa8ee5fa72993e5c58323b2ce4d6343" exitCode=0
Oct 08 22:46:00 crc kubenswrapper[4834]: I1008 22:46:00.772920 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6v4fx" event={"ID":"de233f0c-46d3-491e-83e6-b2334ba4ebd5","Type":"ContainerDied","Data":"d772ec43c319ed37efddc72f55b2cad49aa8ee5fa72993e5c58323b2ce4d6343"}
Oct 08 22:46:01 crc kubenswrapper[4834]: I1008 22:46:01.059886 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 08 22:46:01 crc kubenswrapper[4834]: I1008 22:46:01.059942 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.075344 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.075405 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.178635 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.280900 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-scripts\") pod \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") "
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.280996 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-combined-ca-bundle\") pod \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") "
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.281051 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-config-data\") pod \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") "
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.281171 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw2nb\" (UniqueName: \"kubernetes.io/projected/de233f0c-46d3-491e-83e6-b2334ba4ebd5-kube-api-access-kw2nb\") pod \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\" (UID: \"de233f0c-46d3-491e-83e6-b2334ba4ebd5\") "
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.289205 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-scripts" (OuterVolumeSpecName: "scripts") pod "de233f0c-46d3-491e-83e6-b2334ba4ebd5" (UID: "de233f0c-46d3-491e-83e6-b2334ba4ebd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.295430 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de233f0c-46d3-491e-83e6-b2334ba4ebd5-kube-api-access-kw2nb" (OuterVolumeSpecName: "kube-api-access-kw2nb") pod "de233f0c-46d3-491e-83e6-b2334ba4ebd5" (UID: "de233f0c-46d3-491e-83e6-b2334ba4ebd5"). InnerVolumeSpecName "kube-api-access-kw2nb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.314640 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de233f0c-46d3-491e-83e6-b2334ba4ebd5" (UID: "de233f0c-46d3-491e-83e6-b2334ba4ebd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.316360 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-config-data" (OuterVolumeSpecName: "config-data") pod "de233f0c-46d3-491e-83e6-b2334ba4ebd5" (UID: "de233f0c-46d3-491e-83e6-b2334ba4ebd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.384811 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.384847 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.384858 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de233f0c-46d3-491e-83e6-b2334ba4ebd5-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.384867 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw2nb\" (UniqueName: \"kubernetes.io/projected/de233f0c-46d3-491e-83e6-b2334ba4ebd5-kube-api-access-kw2nb\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.820566 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6v4fx" event={"ID":"de233f0c-46d3-491e-83e6-b2334ba4ebd5","Type":"ContainerDied","Data":"45269a2a42b75fd6148948b13b7ac455a534c09e6960f0b919295836f9320b00"}
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.821588 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45269a2a42b75fd6148948b13b7ac455a534c09e6960f0b919295836f9320b00"
Oct 08 22:46:02 crc kubenswrapper[4834]: I1008 22:46:02.821601 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6v4fx"
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.010130 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.010546 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-log" containerID="cri-o://ffff8b102cbea74492faff4cfca190c1120b7bf0f34305db2f2ebb11b62d6d87" gracePeriod=30
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.010669 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-api" containerID="cri-o://1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d" gracePeriod=30
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.028133 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.028589 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c2a0af76-da3c-4c64-a3d2-12470eb473a5" containerName="nova-scheduler-scheduler" containerID="cri-o://91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3" gracePeriod=30
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.091136 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.091551 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-metadata" containerID="cri-o://034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e" gracePeriod=30
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.091392 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-log" containerID="cri-o://3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b" gracePeriod=30
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.835959 4834 generic.go:334] "Generic (PLEG): container finished" podID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerID="ffff8b102cbea74492faff4cfca190c1120b7bf0f34305db2f2ebb11b62d6d87" exitCode=143
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.836038 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6806419-9992-4f1f-9bbc-36a0d52340ca","Type":"ContainerDied","Data":"ffff8b102cbea74492faff4cfca190c1120b7bf0f34305db2f2ebb11b62d6d87"}
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.839465 4834 generic.go:334] "Generic (PLEG): container finished" podID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerID="3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b" exitCode=143
Oct 08 22:46:03 crc kubenswrapper[4834]: I1008 22:46:03.839502 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e088e5ae-16c2-4a56-a140-3159b429ad55","Type":"ContainerDied","Data":"3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b"}
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.610372 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.728064 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v95r\" (UniqueName: \"kubernetes.io/projected/c2a0af76-da3c-4c64-a3d2-12470eb473a5-kube-api-access-7v95r\") pod \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") "
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.728472 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-config-data\") pod \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") "
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.728611 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-combined-ca-bundle\") pod \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\" (UID: \"c2a0af76-da3c-4c64-a3d2-12470eb473a5\") "
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.736500 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a0af76-da3c-4c64-a3d2-12470eb473a5-kube-api-access-7v95r" (OuterVolumeSpecName: "kube-api-access-7v95r") pod "c2a0af76-da3c-4c64-a3d2-12470eb473a5" (UID: "c2a0af76-da3c-4c64-a3d2-12470eb473a5"). InnerVolumeSpecName "kube-api-access-7v95r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.781269 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2a0af76-da3c-4c64-a3d2-12470eb473a5" (UID: "c2a0af76-da3c-4c64-a3d2-12470eb473a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.798022 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-config-data" (OuterVolumeSpecName: "config-data") pod "c2a0af76-da3c-4c64-a3d2-12470eb473a5" (UID: "c2a0af76-da3c-4c64-a3d2-12470eb473a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.830523 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.830574 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v95r\" (UniqueName: \"kubernetes.io/projected/c2a0af76-da3c-4c64-a3d2-12470eb473a5-kube-api-access-7v95r\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.830588 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a0af76-da3c-4c64-a3d2-12470eb473a5-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.851559 4834 generic.go:334] "Generic (PLEG): container finished" podID="c2a0af76-da3c-4c64-a3d2-12470eb473a5" containerID="91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3" exitCode=0
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.851600 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2a0af76-da3c-4c64-a3d2-12470eb473a5","Type":"ContainerDied","Data":"91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3"}
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.851624 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c2a0af76-da3c-4c64-a3d2-12470eb473a5","Type":"ContainerDied","Data":"8b2c7d5e0f34b533294f4a29ca31f296e3047ff84ee08a4176cbfba2c3a27dae"}
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.851641 4834 scope.go:117] "RemoveContainer" containerID="91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.851765 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.881225 4834 scope.go:117] "RemoveContainer" containerID="91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3"
Oct 08 22:46:04 crc kubenswrapper[4834]: E1008 22:46:04.881718 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3\": container with ID starting with 91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3 not found: ID does not exist" containerID="91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.881757 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3"} err="failed to get container status \"91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3\": rpc error: code = NotFound desc = could not find container \"91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3\": container with ID starting with 91c7c6ed9c5bfb8ed570f28276c62ad0cbb8c13ada235547ff112ef99e1314d3 not found: ID does not exist"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.892299 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.902207 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.924760 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 22:46:04 crc kubenswrapper[4834]: E1008 22:46:04.925284 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de233f0c-46d3-491e-83e6-b2334ba4ebd5" containerName="nova-manage"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.925310 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="de233f0c-46d3-491e-83e6-b2334ba4ebd5" containerName="nova-manage"
Oct 08 22:46:04 crc kubenswrapper[4834]: E1008 22:46:04.925336 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a0af76-da3c-4c64-a3d2-12470eb473a5" containerName="nova-scheduler-scheduler"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.925344 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a0af76-da3c-4c64-a3d2-12470eb473a5" containerName="nova-scheduler-scheduler"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.925557 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a0af76-da3c-4c64-a3d2-12470eb473a5" containerName="nova-scheduler-scheduler"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.925589 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="de233f0c-46d3-491e-83e6-b2334ba4ebd5" containerName="nova-manage"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.926545 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.928635 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 08 22:46:04 crc kubenswrapper[4834]: I1008 22:46:04.942349 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.032971 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n2r2\" (UniqueName: \"kubernetes.io/projected/f120f0d7-ba00-4502-a2f3-7c619440887a-kube-api-access-4n2r2\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0"
Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.033031 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0"
Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.033240 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-config-data\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0"
Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.135047 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2r2\" (UniqueName: \"kubernetes.io/projected/f120f0d7-ba00-4502-a2f3-7c619440887a-kube-api-access-4n2r2\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0"
Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.135101 4834
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0" Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.135157 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-config-data\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0" Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.142790 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-config-data\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0" Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.143333 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0" Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.154573 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2r2\" (UniqueName: \"kubernetes.io/projected/f120f0d7-ba00-4502-a2f3-7c619440887a-kube-api-access-4n2r2\") pod \"nova-scheduler-0\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " pod="openstack/nova-scheduler-0" Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.248349 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.578371 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a0af76-da3c-4c64-a3d2-12470eb473a5" path="/var/lib/kubelet/pods/c2a0af76-da3c-4c64-a3d2-12470eb473a5/volumes" Oct 08 22:46:05 crc kubenswrapper[4834]: W1008 22:46:05.801474 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf120f0d7_ba00_4502_a2f3_7c619440887a.slice/crio-7689a28d2a430320b843a86f66429ec6661809e8039c4f8e35600b4b4812a94e WatchSource:0}: Error finding container 7689a28d2a430320b843a86f66429ec6661809e8039c4f8e35600b4b4812a94e: Status 404 returned error can't find the container with id 7689a28d2a430320b843a86f66429ec6661809e8039c4f8e35600b4b4812a94e Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.805802 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:46:05 crc kubenswrapper[4834]: I1008 22:46:05.874801 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f120f0d7-ba00-4502-a2f3-7c619440887a","Type":"ContainerStarted","Data":"7689a28d2a430320b843a86f66429ec6661809e8039c4f8e35600b4b4812a94e"} Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.262162 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:51586->10.217.0.192:8775: read: connection reset by peer" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.262183 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 
10.217.0.2:51574->10.217.0.192:8775: read: connection reset by peer" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.696786 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.779830 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-combined-ca-bundle\") pod \"e088e5ae-16c2-4a56-a140-3159b429ad55\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.779880 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e088e5ae-16c2-4a56-a140-3159b429ad55-logs\") pod \"e088e5ae-16c2-4a56-a140-3159b429ad55\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.779931 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-config-data\") pod \"e088e5ae-16c2-4a56-a140-3159b429ad55\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.779960 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-nova-metadata-tls-certs\") pod \"e088e5ae-16c2-4a56-a140-3159b429ad55\" (UID: \"e088e5ae-16c2-4a56-a140-3159b429ad55\") " Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.780057 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vnps\" (UniqueName: \"kubernetes.io/projected/e088e5ae-16c2-4a56-a140-3159b429ad55-kube-api-access-8vnps\") pod \"e088e5ae-16c2-4a56-a140-3159b429ad55\" (UID: 
\"e088e5ae-16c2-4a56-a140-3159b429ad55\") " Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.781752 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e088e5ae-16c2-4a56-a140-3159b429ad55-logs" (OuterVolumeSpecName: "logs") pod "e088e5ae-16c2-4a56-a140-3159b429ad55" (UID: "e088e5ae-16c2-4a56-a140-3159b429ad55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.803732 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e088e5ae-16c2-4a56-a140-3159b429ad55-kube-api-access-8vnps" (OuterVolumeSpecName: "kube-api-access-8vnps") pod "e088e5ae-16c2-4a56-a140-3159b429ad55" (UID: "e088e5ae-16c2-4a56-a140-3159b429ad55"). InnerVolumeSpecName "kube-api-access-8vnps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.811117 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e088e5ae-16c2-4a56-a140-3159b429ad55" (UID: "e088e5ae-16c2-4a56-a140-3159b429ad55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.811471 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-config-data" (OuterVolumeSpecName: "config-data") pod "e088e5ae-16c2-4a56-a140-3159b429ad55" (UID: "e088e5ae-16c2-4a56-a140-3159b429ad55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.839694 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e088e5ae-16c2-4a56-a140-3159b429ad55" (UID: "e088e5ae-16c2-4a56-a140-3159b429ad55"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.882121 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.882179 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e088e5ae-16c2-4a56-a140-3159b429ad55-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.882192 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.882204 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e088e5ae-16c2-4a56-a140-3159b429ad55-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.882217 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vnps\" (UniqueName: \"kubernetes.io/projected/e088e5ae-16c2-4a56-a140-3159b429ad55-kube-api-access-8vnps\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.889193 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerID="034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e" exitCode=0 Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.889270 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e088e5ae-16c2-4a56-a140-3159b429ad55","Type":"ContainerDied","Data":"034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e"} Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.889298 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e088e5ae-16c2-4a56-a140-3159b429ad55","Type":"ContainerDied","Data":"f026cd3653f38d7102fdd0f68278ebaf57e68c65a9a77ee53f1ec61c4455fd55"} Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.889336 4834 scope.go:117] "RemoveContainer" containerID="034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.889516 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.894904 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f120f0d7-ba00-4502-a2f3-7c619440887a","Type":"ContainerStarted","Data":"49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189"} Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.913544 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.913521259 podStartE2EDuration="2.913521259s" podCreationTimestamp="2025-10-08 22:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:46:06.908849975 +0000 UTC m=+1374.731734721" watchObservedRunningTime="2025-10-08 22:46:06.913521259 +0000 UTC m=+1374.736406005" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.933603 4834 scope.go:117] "RemoveContainer" containerID="3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.933714 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.957287 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.958455 4834 scope.go:117] "RemoveContainer" containerID="034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e" Oct 08 22:46:06 crc kubenswrapper[4834]: E1008 22:46:06.958854 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e\": container with ID starting with 034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e not found: ID does not exist" 
containerID="034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.958906 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e"} err="failed to get container status \"034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e\": rpc error: code = NotFound desc = could not find container \"034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e\": container with ID starting with 034f1ea19ec6256dbb510bd7df11b90a7fe2bc9ae6e6370fafd533fe33931e3e not found: ID does not exist" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.958927 4834 scope.go:117] "RemoveContainer" containerID="3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b" Oct 08 22:46:06 crc kubenswrapper[4834]: E1008 22:46:06.959984 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b\": container with ID starting with 3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b not found: ID does not exist" containerID="3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.960006 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b"} err="failed to get container status \"3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b\": rpc error: code = NotFound desc = could not find container \"3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b\": container with ID starting with 3bd6f7e16a00ae83612e785a5ba08b3c244e2b13197d441be36d452506a7314b not found: ID does not exist" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.976373 4834 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:06 crc kubenswrapper[4834]: E1008 22:46:06.976841 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-metadata" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.976859 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-metadata" Oct 08 22:46:06 crc kubenswrapper[4834]: E1008 22:46:06.976880 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-log" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.976887 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-log" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.977052 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-metadata" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.977068 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" containerName="nova-metadata-log" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.978038 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.983430 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.983765 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 08 22:46:06 crc kubenswrapper[4834]: I1008 22:46:06.987532 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.085058 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.085132 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-config-data\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.085282 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f052dbd-010a-456f-af57-0b6b2f6e70ad-logs\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.085318 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258sm\" (UniqueName: \"kubernetes.io/projected/2f052dbd-010a-456f-af57-0b6b2f6e70ad-kube-api-access-258sm\") pod \"nova-metadata-0\" (UID: 
\"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.085354 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.187353 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.187433 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-config-data\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.187542 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f052dbd-010a-456f-af57-0b6b2f6e70ad-logs\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.187578 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258sm\" (UniqueName: \"kubernetes.io/projected/2f052dbd-010a-456f-af57-0b6b2f6e70ad-kube-api-access-258sm\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.187624 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.187976 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f052dbd-010a-456f-af57-0b6b2f6e70ad-logs\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.191571 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.194216 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-config-data\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.196694 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.219213 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258sm\" (UniqueName: \"kubernetes.io/projected/2f052dbd-010a-456f-af57-0b6b2f6e70ad-kube-api-access-258sm\") pod 
\"nova-metadata-0\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.304358 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.580382 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e088e5ae-16c2-4a56-a140-3159b429ad55" path="/var/lib/kubelet/pods/e088e5ae-16c2-4a56-a140-3159b429ad55/volumes" Oct 08 22:46:07 crc kubenswrapper[4834]: E1008 22:46:07.601602 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6806419_9992_4f1f_9bbc_36a0d52340ca.slice/crio-1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6806419_9992_4f1f_9bbc_36a0d52340ca.slice/crio-conmon-1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.798895 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.906369 4834 generic.go:334] "Generic (PLEG): container finished" podID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerID="1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d" exitCode=0 Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.906456 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e6806419-9992-4f1f-9bbc-36a0d52340ca","Type":"ContainerDied","Data":"1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d"} Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.906501 4834 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"e6806419-9992-4f1f-9bbc-36a0d52340ca","Type":"ContainerDied","Data":"d608cc19d0d0fbfdd7408daa22ba79c5f50ab3be551d097b2fa89a4d43300ba2"} Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.906526 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d608cc19d0d0fbfdd7408daa22ba79c5f50ab3be551d097b2fa89a4d43300ba2" Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.909957 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f052dbd-010a-456f-af57-0b6b2f6e70ad","Type":"ContainerStarted","Data":"7acb661a811c85fadcd6283573fe932fc01bc74c02d8f572f503b03d2d25bacb"} Oct 08 22:46:07 crc kubenswrapper[4834]: I1008 22:46:07.940551 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.110941 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6806419-9992-4f1f-9bbc-36a0d52340ca-logs\") pod \"e6806419-9992-4f1f-9bbc-36a0d52340ca\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111021 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-combined-ca-bundle\") pod \"e6806419-9992-4f1f-9bbc-36a0d52340ca\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111039 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-internal-tls-certs\") pod \"e6806419-9992-4f1f-9bbc-36a0d52340ca\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111152 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-config-data\") pod \"e6806419-9992-4f1f-9bbc-36a0d52340ca\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111191 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-public-tls-certs\") pod \"e6806419-9992-4f1f-9bbc-36a0d52340ca\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111218 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzz4h\" (UniqueName: \"kubernetes.io/projected/e6806419-9992-4f1f-9bbc-36a0d52340ca-kube-api-access-gzz4h\") pod \"e6806419-9992-4f1f-9bbc-36a0d52340ca\" (UID: \"e6806419-9992-4f1f-9bbc-36a0d52340ca\") " Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111586 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6806419-9992-4f1f-9bbc-36a0d52340ca-logs" (OuterVolumeSpecName: "logs") pod "e6806419-9992-4f1f-9bbc-36a0d52340ca" (UID: "e6806419-9992-4f1f-9bbc-36a0d52340ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.111673 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6806419-9992-4f1f-9bbc-36a0d52340ca-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.116238 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6806419-9992-4f1f-9bbc-36a0d52340ca-kube-api-access-gzz4h" (OuterVolumeSpecName: "kube-api-access-gzz4h") pod "e6806419-9992-4f1f-9bbc-36a0d52340ca" (UID: "e6806419-9992-4f1f-9bbc-36a0d52340ca"). InnerVolumeSpecName "kube-api-access-gzz4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.135666 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6806419-9992-4f1f-9bbc-36a0d52340ca" (UID: "e6806419-9992-4f1f-9bbc-36a0d52340ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.149286 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-config-data" (OuterVolumeSpecName: "config-data") pod "e6806419-9992-4f1f-9bbc-36a0d52340ca" (UID: "e6806419-9992-4f1f-9bbc-36a0d52340ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.166360 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6806419-9992-4f1f-9bbc-36a0d52340ca" (UID: "e6806419-9992-4f1f-9bbc-36a0d52340ca"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.200953 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e6806419-9992-4f1f-9bbc-36a0d52340ca" (UID: "e6806419-9992-4f1f-9bbc-36a0d52340ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.213519 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.213578 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.213595 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzz4h\" (UniqueName: \"kubernetes.io/projected/e6806419-9992-4f1f-9bbc-36a0d52340ca-kube-api-access-gzz4h\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.213609 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.213620 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6806419-9992-4f1f-9bbc-36a0d52340ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.924214 4834 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.924260 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f052dbd-010a-456f-af57-0b6b2f6e70ad","Type":"ContainerStarted","Data":"076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde"} Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.924321 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f052dbd-010a-456f-af57-0b6b2f6e70ad","Type":"ContainerStarted","Data":"e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f"} Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.959629 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.959610556 podStartE2EDuration="2.959610556s" podCreationTimestamp="2025-10-08 22:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:46:08.945614715 +0000 UTC m=+1376.768499521" watchObservedRunningTime="2025-10-08 22:46:08.959610556 +0000 UTC m=+1376.782495292" Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.979121 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:08 crc kubenswrapper[4834]: I1008 22:46:08.990662 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.009939 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:09 crc kubenswrapper[4834]: E1008 22:46:09.010367 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-api" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.010391 4834 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-api" Oct 08 22:46:09 crc kubenswrapper[4834]: E1008 22:46:09.010437 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-log" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.010445 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-log" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.010663 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-api" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.010690 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" containerName="nova-api-log" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.012016 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.014268 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.014763 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.016010 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.028949 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.130567 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.130629 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-config-data\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.130776 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.130858 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.131029 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqhqk\" (UniqueName: \"kubernetes.io/projected/e4629ae3-d685-43c9-81fd-49e84abd427f-kube-api-access-fqhqk\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.131105 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4629ae3-d685-43c9-81fd-49e84abd427f-logs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.305538 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.305661 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhqk\" (UniqueName: \"kubernetes.io/projected/e4629ae3-d685-43c9-81fd-49e84abd427f-kube-api-access-fqhqk\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.305697 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4629ae3-d685-43c9-81fd-49e84abd427f-logs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.305748 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.305803 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-config-data\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.305936 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.306907 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4629ae3-d685-43c9-81fd-49e84abd427f-logs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.312007 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-public-tls-certs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.313005 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.313465 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.316522 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-config-data\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.327009 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhqk\" (UniqueName: 
\"kubernetes.io/projected/e4629ae3-d685-43c9-81fd-49e84abd427f-kube-api-access-fqhqk\") pod \"nova-api-0\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " pod="openstack/nova-api-0" Oct 08 22:46:09 crc kubenswrapper[4834]: I1008 22:46:09.353038 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:46:10 crc kubenswrapper[4834]: I1008 22:46:09.569009 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6806419-9992-4f1f-9bbc-36a0d52340ca" path="/var/lib/kubelet/pods/e6806419-9992-4f1f-9bbc-36a0d52340ca/volumes" Oct 08 22:46:10 crc kubenswrapper[4834]: I1008 22:46:10.249242 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 08 22:46:10 crc kubenswrapper[4834]: I1008 22:46:10.529489 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:10 crc kubenswrapper[4834]: W1008 22:46:10.539384 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4629ae3_d685_43c9_81fd_49e84abd427f.slice/crio-f3fc1ed14f7e07d05b56b95383b8614931412f15377cf0d2af9d197fc464ce56 WatchSource:0}: Error finding container f3fc1ed14f7e07d05b56b95383b8614931412f15377cf0d2af9d197fc464ce56: Status 404 returned error can't find the container with id f3fc1ed14f7e07d05b56b95383b8614931412f15377cf0d2af9d197fc464ce56 Oct 08 22:46:10 crc kubenswrapper[4834]: I1008 22:46:10.950886 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4629ae3-d685-43c9-81fd-49e84abd427f","Type":"ContainerStarted","Data":"1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4"} Oct 08 22:46:10 crc kubenswrapper[4834]: I1008 22:46:10.951302 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e4629ae3-d685-43c9-81fd-49e84abd427f","Type":"ContainerStarted","Data":"f3fc1ed14f7e07d05b56b95383b8614931412f15377cf0d2af9d197fc464ce56"} Oct 08 22:46:11 crc kubenswrapper[4834]: I1008 22:46:11.967425 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4629ae3-d685-43c9-81fd-49e84abd427f","Type":"ContainerStarted","Data":"91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb"} Oct 08 22:46:12 crc kubenswrapper[4834]: I1008 22:46:12.005004 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.002462311 podStartE2EDuration="4.002462311s" podCreationTimestamp="2025-10-08 22:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-08 22:46:11.994457346 +0000 UTC m=+1379.817342132" watchObservedRunningTime="2025-10-08 22:46:12.002462311 +0000 UTC m=+1379.825347097" Oct 08 22:46:12 crc kubenswrapper[4834]: I1008 22:46:12.304565 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:46:12 crc kubenswrapper[4834]: I1008 22:46:12.304709 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 08 22:46:15 crc kubenswrapper[4834]: I1008 22:46:15.249134 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 08 22:46:15 crc kubenswrapper[4834]: I1008 22:46:15.288872 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 08 22:46:16 crc kubenswrapper[4834]: I1008 22:46:16.059555 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.024938 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.024988 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.025047 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.025803 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6163dc66da07cee67fb6457237db1afec09f5bef4082ecfbaefff77cd8dc028c"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.025864 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://6163dc66da07cee67fb6457237db1afec09f5bef4082ecfbaefff77cd8dc028c" gracePeriod=600 Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.305132 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 22:46:17 crc kubenswrapper[4834]: I1008 22:46:17.305567 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 08 
22:46:18 crc kubenswrapper[4834]: I1008 22:46:18.045356 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="6163dc66da07cee67fb6457237db1afec09f5bef4082ecfbaefff77cd8dc028c" exitCode=0 Oct 08 22:46:18 crc kubenswrapper[4834]: I1008 22:46:18.045614 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"6163dc66da07cee67fb6457237db1afec09f5bef4082ecfbaefff77cd8dc028c"} Oct 08 22:46:18 crc kubenswrapper[4834]: I1008 22:46:18.045852 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f"} Oct 08 22:46:18 crc kubenswrapper[4834]: I1008 22:46:18.045893 4834 scope.go:117] "RemoveContainer" containerID="ae7fc299ba63d30578a076e14832e2ba3dd0a6f32f375b1c858285b17f026ca6" Oct 08 22:46:18 crc kubenswrapper[4834]: I1008 22:46:18.326414 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:46:18 crc kubenswrapper[4834]: I1008 22:46:18.326444 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:46:19 crc kubenswrapper[4834]: I1008 22:46:19.353378 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Oct 08 22:46:19 crc kubenswrapper[4834]: I1008 22:46:19.353851 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 08 22:46:20 crc kubenswrapper[4834]: I1008 22:46:20.112469 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 08 22:46:20 crc kubenswrapper[4834]: I1008 22:46:20.370300 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:46:20 crc kubenswrapper[4834]: I1008 22:46:20.370958 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 08 22:46:27 crc kubenswrapper[4834]: I1008 22:46:27.312049 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:46:27 crc kubenswrapper[4834]: I1008 22:46:27.313714 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 08 22:46:27 crc kubenswrapper[4834]: I1008 22:46:27.321366 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:46:28 crc kubenswrapper[4834]: I1008 22:46:28.172806 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 08 22:46:29 crc kubenswrapper[4834]: I1008 22:46:29.368019 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:46:29 crc kubenswrapper[4834]: I1008 22:46:29.368752 4834 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:46:29 crc kubenswrapper[4834]: I1008 22:46:29.371727 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 08 22:46:29 crc kubenswrapper[4834]: I1008 22:46:29.377275 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:46:30 crc kubenswrapper[4834]: I1008 22:46:30.187456 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 08 22:46:30 crc kubenswrapper[4834]: I1008 22:46:30.195630 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.328653 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.329495 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" containerName="openstackclient" containerID="cri-o://048b028a05ed9d1e34b226eae2432e5d45037864ccaf2322ecdfa230f03f479c" gracePeriod=2 Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.379189 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.722336 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.722751 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="openstack-network-exporter" containerID="cri-o://28c0d0fa60a03d4e0502734c048a91f1f1485f354314d1d0e3b140ec8efc322e" gracePeriod=300 Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.779193 4834 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/neutron3518-account-delete-sdqt9"] Oct 08 22:46:47 crc kubenswrapper[4834]: E1008 22:46:47.779581 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" containerName="openstackclient" Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.779599 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" containerName="openstackclient" Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.779788 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" containerName="openstackclient" Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.780441 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.801432 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron3518-account-delete-sdqt9"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.824802 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.926347 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.926879 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="ovn-northd" containerID="cri-o://befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7" gracePeriod=30 Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.927391 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="openstack-network-exporter" 
containerID="cri-o://ee6702ece47fd3dad3f711249016d49520a142737dfe65f64635bcd1579089db" gracePeriod=30 Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.933863 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement3088-account-delete-f6znv"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.935010 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement3088-account-delete-f6znv" Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.965925 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement3088-account-delete-f6znv"] Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.973252 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npnmp\" (UniqueName: \"kubernetes.io/projected/f70f2f55-ae76-4f8a-95a4-49933695ff6b-kube-api-access-npnmp\") pod \"neutron3518-account-delete-sdqt9\" (UID: \"f70f2f55-ae76-4f8a-95a4-49933695ff6b\") " pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:47 crc kubenswrapper[4834]: E1008 22:46:47.973539 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 22:46:47 crc kubenswrapper[4834]: E1008 22:46:47.973592 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data podName:08a7721f-38a1-4a82-88ed-6f70290b5a6d nodeName:}" failed. No retries permitted until 2025-10-08 22:46:48.47357407 +0000 UTC m=+1416.296458816 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data") pod "rabbitmq-server-0" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d") : configmap "rabbitmq-config-data" not found Oct 08 22:46:47 crc kubenswrapper[4834]: I1008 22:46:47.998072 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7bm6j"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.009725 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bdkgz"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.020287 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bdkgz"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.036206 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="ovsdbserver-nb" containerID="cri-o://77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd" gracePeriod=300 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.076132 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npnmp\" (UniqueName: \"kubernetes.io/projected/f70f2f55-ae76-4f8a-95a4-49933695ff6b-kube-api-access-npnmp\") pod \"neutron3518-account-delete-sdqt9\" (UID: \"f70f2f55-ae76-4f8a-95a4-49933695ff6b\") " pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.076261 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kbd\" (UniqueName: \"kubernetes.io/projected/12663254-035f-4057-b178-2dc4d42db157-kube-api-access-n4kbd\") pod \"placement3088-account-delete-f6znv\" (UID: \"12663254-035f-4057-b178-2dc4d42db157\") " pod="openstack/placement3088-account-delete-f6znv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.083597 4834 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance9716-account-delete-6zvrv"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.085296 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.131336 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-jmkzz"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.137240 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npnmp\" (UniqueName: \"kubernetes.io/projected/f70f2f55-ae76-4f8a-95a4-49933695ff6b-kube-api-access-npnmp\") pod \"neutron3518-account-delete-sdqt9\" (UID: \"f70f2f55-ae76-4f8a-95a4-49933695ff6b\") " pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.178454 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7p6c\" (UniqueName: \"kubernetes.io/projected/a20774f5-74f4-4f7f-9f33-b4b55585cb7d-kube-api-access-n7p6c\") pod \"glance9716-account-delete-6zvrv\" (UID: \"a20774f5-74f4-4f7f-9f33-b4b55585cb7d\") " pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.178638 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kbd\" (UniqueName: \"kubernetes.io/projected/12663254-035f-4057-b178-2dc4d42db157-kube-api-access-n4kbd\") pod \"placement3088-account-delete-f6znv\" (UID: \"12663254-035f-4057-b178-2dc4d42db157\") " pod="openstack/placement3088-account-delete-f6znv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.189208 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance9716-account-delete-6zvrv"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.224940 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-metrics-l24rp"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.225234 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-l24rp" podUID="9087d728-8ea1-4f0c-aff6-7dae2fd139ec" containerName="openstack-network-exporter" containerID="cri-o://cf8764a03b4bf1c3a07b250cdaacaf2edfad20da4aa53f6789acb7d9ee72de4d" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.241258 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kbd\" (UniqueName: \"kubernetes.io/projected/12663254-035f-4057-b178-2dc4d42db157-kube-api-access-n4kbd\") pod \"placement3088-account-delete-f6znv\" (UID: \"12663254-035f-4057-b178-2dc4d42db157\") " pod="openstack/placement3088-account-delete-f6znv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.264797 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mhj54"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.281451 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7p6c\" (UniqueName: \"kubernetes.io/projected/a20774f5-74f4-4f7f-9f33-b4b55585cb7d-kube-api-access-n7p6c\") pod \"glance9716-account-delete-6zvrv\" (UID: \"a20774f5-74f4-4f7f-9f33-b4b55585cb7d\") " pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.291346 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement3088-account-delete-f6znv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.301196 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mhj54"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.310903 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7p6c\" (UniqueName: \"kubernetes.io/projected/a20774f5-74f4-4f7f-9f33-b4b55585cb7d-kube-api-access-n7p6c\") pod \"glance9716-account-delete-6zvrv\" (UID: \"a20774f5-74f4-4f7f-9f33-b4b55585cb7d\") " pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.315195 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.329278 4834 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-7bm6j" message="Exiting ovn-controller (1) " Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.329315 4834 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-7bm6j" podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" containerID="cri-o://0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.329343 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-7bm6j" podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" containerID="cri-o://0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.336630 
4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5967cc9597-s2zsg"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.336913 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="dnsmasq-dns" containerID="cri-o://50d402217bd3c2796ec12c38a36a0c886b0ae574eb8874d5e4be8b40f8de8693" gracePeriod=10 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.366368 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dcw47"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.370896 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.394354 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.394413 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data podName:9809d14f-10d2-479f-94d9-5b3ae7f49e7b nodeName:}" failed. No retries permitted until 2025-10-08 22:46:48.894393956 +0000 UTC m=+1416.717278702 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b") : configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.400646 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dcw47"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.404814 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.434077 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-449pd"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.447488 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-449pd"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.464867 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l24rp_9087d728-8ea1-4f0c-aff6-7dae2fd139ec/openstack-network-exporter/0.log" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.464921 4834 generic.go:334] "Generic (PLEG): container finished" podID="9087d728-8ea1-4f0c-aff6-7dae2fd139ec" containerID="cf8764a03b4bf1c3a07b250cdaacaf2edfad20da4aa53f6789acb7d9ee72de4d" exitCode=2 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.465024 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l24rp" event={"ID":"9087d728-8ea1-4f0c-aff6-7dae2fd139ec","Type":"ContainerDied","Data":"cf8764a03b4bf1c3a07b250cdaacaf2edfad20da4aa53f6789acb7d9ee72de4d"} Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.469429 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1c373-account-delete-cq7ht"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.470943 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.484267 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1c373-account-delete-cq7ht"] Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.492912 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.492963 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data podName:08a7721f-38a1-4a82-88ed-6f70290b5a6d nodeName:}" failed. No retries permitted until 2025-10-08 22:46:49.492950585 +0000 UTC m=+1417.315835331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data") pod "rabbitmq-server-0" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d") : configmap "rabbitmq-config-data" not found Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.503874 4834 generic.go:334] "Generic (PLEG): container finished" podID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerID="ee6702ece47fd3dad3f711249016d49520a142737dfe65f64635bcd1579089db" exitCode=2 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.503961 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00e05134-e159-40fe-9c63-a0dc406c8dee","Type":"ContainerDied","Data":"ee6702ece47fd3dad3f711249016d49520a142737dfe65f64635bcd1579089db"} Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.527419 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.528026 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" 
containerName="openstack-network-exporter" containerID="cri-o://e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76" gracePeriod=300 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.594408 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkcl\" (UniqueName: \"kubernetes.io/projected/b0f06117-94bf-4e56-b5f7-e83eda8ee811-kube-api-access-6dkcl\") pod \"novacell1c373-account-delete-cq7ht\" (UID: \"b0f06117-94bf-4e56-b5f7-e83eda8ee811\") " pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.596352 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-g2wdt"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.618817 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46d2d9d8-6fb8-45a7-bcba-5d6121b26dda/ovsdbserver-nb/0.log" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.618862 4834 generic.go:334] "Generic (PLEG): container finished" podID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerID="28c0d0fa60a03d4e0502734c048a91f1f1485f354314d1d0e3b140ec8efc322e" exitCode=2 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.618878 4834 generic.go:334] "Generic (PLEG): container finished" podID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerID="77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd" exitCode=143 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.619014 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-g2wdt"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.619041 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda","Type":"ContainerDied","Data":"28c0d0fa60a03d4e0502734c048a91f1f1485f354314d1d0e3b140ec8efc322e"} Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.619058 4834 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda","Type":"ContainerDied","Data":"77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd"} Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.644020 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapif0f4-account-delete-b5tm7"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.645262 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.664422 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapif0f4-account-delete-b5tm7"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.684917 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f6cc747c5-vzjm2"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.685334 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f6cc747c5-vzjm2" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-api" containerID="cri-o://0994fee875d4da6ff390c9c0551593b552bdf75310faf54c410141d37e0f066c" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.685728 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f6cc747c5-vzjm2" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-httpd" containerID="cri-o://4c640bd73e780e0d64b84db93f10b09fd43cf57588febe8f44765e1d6c226f04" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.699859 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkcl\" (UniqueName: \"kubernetes.io/projected/b0f06117-94bf-4e56-b5f7-e83eda8ee811-kube-api-access-6dkcl\") pod \"novacell1c373-account-delete-cq7ht\" (UID: \"b0f06117-94bf-4e56-b5f7-e83eda8ee811\") " 
pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.712024 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2cn89"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.718990 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkcl\" (UniqueName: \"kubernetes.io/projected/b0f06117-94bf-4e56-b5f7-e83eda8ee811-kube-api-access-6dkcl\") pod \"novacell1c373-account-delete-cq7ht\" (UID: \"b0f06117-94bf-4e56-b5f7-e83eda8ee811\") " pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.742552 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-2cn89"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.766022 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="ovsdbserver-sb" containerID="cri-o://f5e73cfdd6ec9a2f76bfa0052b93e5d772c1194c5cd98642eb2f1e306db7ed10" gracePeriod=300 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.824892 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zsb\" (UniqueName: \"kubernetes.io/projected/2851fb85-5e8a-46af-9cac-d4df0c5eb16a-kube-api-access-z4zsb\") pod \"novaapif0f4-account-delete-b5tm7\" (UID: \"2851fb85-5e8a-46af-9cac-d4df0c5eb16a\") " pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.885604 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.904210 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.921305 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55b44744c4-z2p4d"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.928735 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zsb\" (UniqueName: \"kubernetes.io/projected/2851fb85-5e8a-46af-9cac-d4df0c5eb16a-kube-api-access-z4zsb\") pod \"novaapif0f4-account-delete-b5tm7\" (UID: \"2851fb85-5e8a-46af-9cac-d4df0c5eb16a\") " pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.929108 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:48 crc kubenswrapper[4834]: E1008 22:46:48.929241 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data podName:9809d14f-10d2-479f-94d9-5b3ae7f49e7b nodeName:}" failed. No retries permitted until 2025-10-08 22:46:49.929221267 +0000 UTC m=+1417.752106013 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b") : configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.938503 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.943195 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55b44744c4-z2p4d" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-log" containerID="cri-o://8572778b6d4762619545e6de6bc9dc967110b451eb1a7a8a406e0ababce1ccb3" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.943371 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-log" containerID="cri-o://97de4609bf212569f354a8db48765bc289647b7d03ecd1100224cb7a89ad47c3" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.943428 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55b44744c4-z2p4d" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-api" containerID="cri-o://1057d7b178b738e910bc1a7a9841be19ad4ff8504d2eb4e2c8e8ff4bf273e1f0" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.943615 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-httpd" containerID="cri-o://c3f0ee497d77bf25a33bb1a3381c77e479b17a2eac69da1fa447bfb0e183a0e4" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.943990 4834 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-server" containerID="cri-o://916771be477f120fd96b5a1f5443d68a2ee7c86b4f15262feb8c7da880418d12" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.943993 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-server" containerID="cri-o://06fd06ac9cbc7b6f70094375f8d6b9bf90833b141ddadf2cad95017b587d05a9" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944036 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="swift-recon-cron" containerID="cri-o://3fd81fba6fef2c5e63023d49c353e82227e744ef65163a821e731e717fb6624a" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944070 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="rsync" containerID="cri-o://fd0fe5fab5d6566b676108078b9bb8956f0d4000e18eeb98d65adfb995ed66db" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944100 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-expirer" containerID="cri-o://cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944107 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-updater" containerID="cri-o://f652c6fc5992ba73c5161286567da0fcc0cd2540cfc9653f9d3e00b5ca106caa" gracePeriod=30 Oct 08 22:46:48 crc 
kubenswrapper[4834]: I1008 22:46:48.944135 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-updater" containerID="cri-o://d032b4a1902d533ed5cdb4bbf1a9f2c37094a94a35f3ce8f02ac5f212367ceee" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944166 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-auditor" containerID="cri-o://6ce010fd5f1a7322ee19e0d85fe1859b18c483f1e6cee129d2324a72aca8c9ae" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944187 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-auditor" containerID="cri-o://f098a2e0cb0624cb3fe14c1c933baa20366c8520017da6de05a7436114e9e875" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944200 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-replicator" containerID="cri-o://2091dbf498187923ad07111022a998400d80f39064fb32a84e0b373a124b3d4d" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944220 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-replicator" containerID="cri-o://3f5e1b6ecf4bdb7095f6e4f54237dd8d5e0fce913b21ceb7e2e9d3bbe4da4702" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944230 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-server" 
containerID="cri-o://d8010142af61d9a4bcf191642a46ef203b560507fdc751d54f09b14df4705c8f" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944261 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-reaper" containerID="cri-o://32976af02fd539929cf231bf96ee9923fbea70a134e86a660abf29e813f31e4c" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944303 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-auditor" containerID="cri-o://5523f6432e877d33c1dd624b005e3a10c13f5c90d768a9ec5b33eb492f9cd80b" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.944336 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-replicator" containerID="cri-o://7514b6a8989d60bf273471f9730c93e60d545194b6408566793dc09f41f4ed2c" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.952744 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zsb\" (UniqueName: \"kubernetes.io/projected/2851fb85-5e8a-46af-9cac-d4df0c5eb16a-kube-api-access-z4zsb\") pod \"novaapif0f4-account-delete-b5tm7\" (UID: \"2851fb85-5e8a-46af-9cac-d4df0c5eb16a\") " pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.974904 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.975230 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="probe" 
containerID="cri-o://16284da8c6f35291d6b17fb6dfd3cb470e7f9c4ec4f746f4bc0659d2fb85b5fb" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.975133 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="cinder-scheduler" containerID="cri-o://0caa48090b97f4cd0f143f8b3522146daa146a232e6629762e868998fa0cbab2" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.995183 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.995397 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-log" containerID="cri-o://354d0197ed5738529f4ce14ae4d167d9d0a781f57eca46b2a301712e02875868" gracePeriod=30 Oct 08 22:46:48 crc kubenswrapper[4834]: I1008 22:46:48.995775 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-httpd" containerID="cri-o://7af54c4f4905381853f354524b17a405dd2c9d5ab3d098a361ea339c61a15d5d" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.007270 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: connect: connection refused" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.009284 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6v4fx"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.013246 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.033125 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6v4fx"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.066376 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rx46c"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.094156 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rx46c"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.186049 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c is running failed: container process not found" containerID="0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.186335 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c is running failed: container process not found" containerID="0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.187041 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c is running failed: container process not found" containerID="0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" 
cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.187064 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-7bm6j" podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.198628 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.199340 4834 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 08 22:46:49 crc kubenswrapper[4834]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 22:46:49 crc kubenswrapper[4834]: + source /usr/local/bin/container-scripts/functions Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNBridge=br-int Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNRemote=tcp:localhost:6642 Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNEncapType=geneve Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNAvailabilityZones= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ EnableChassisAsGateway=true Oct 08 22:46:49 crc kubenswrapper[4834]: ++ PhysicalNetworks= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNHostName= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 22:46:49 crc kubenswrapper[4834]: ++ 
ovs_dir=/var/lib/openvswitch Oct 08 22:46:49 crc kubenswrapper[4834]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 22:46:49 crc kubenswrapper[4834]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 22:46:49 crc kubenswrapper[4834]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 22:46:49 crc kubenswrapper[4834]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 22:46:49 crc kubenswrapper[4834]: + sleep 0.5 Oct 08 22:46:49 crc kubenswrapper[4834]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 22:46:49 crc kubenswrapper[4834]: + cleanup_ovsdb_server_semaphore Oct 08 22:46:49 crc kubenswrapper[4834]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 22:46:49 crc kubenswrapper[4834]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 22:46:49 crc kubenswrapper[4834]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-jmkzz" message=< Oct 08 22:46:49 crc kubenswrapper[4834]: Exiting ovsdb-server (5) [ OK ] Oct 08 22:46:49 crc kubenswrapper[4834]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 22:46:49 crc kubenswrapper[4834]: + source /usr/local/bin/container-scripts/functions Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNBridge=br-int Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNRemote=tcp:localhost:6642 Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNEncapType=geneve Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNAvailabilityZones= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ EnableChassisAsGateway=true Oct 08 22:46:49 crc kubenswrapper[4834]: ++ PhysicalNetworks= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNHostName= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 22:46:49 crc kubenswrapper[4834]: ++ ovs_dir=/var/lib/openvswitch Oct 08 22:46:49 crc 
kubenswrapper[4834]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 22:46:49 crc kubenswrapper[4834]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 22:46:49 crc kubenswrapper[4834]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 22:46:49 crc kubenswrapper[4834]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 22:46:49 crc kubenswrapper[4834]: + sleep 0.5 Oct 08 22:46:49 crc kubenswrapper[4834]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 22:46:49 crc kubenswrapper[4834]: + cleanup_ovsdb_server_semaphore Oct 08 22:46:49 crc kubenswrapper[4834]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 22:46:49 crc kubenswrapper[4834]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 22:46:49 crc kubenswrapper[4834]: > Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.200247 4834 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 08 22:46:49 crc kubenswrapper[4834]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 08 22:46:49 crc kubenswrapper[4834]: + source /usr/local/bin/container-scripts/functions Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNBridge=br-int Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNRemote=tcp:localhost:6642 Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNEncapType=geneve Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNAvailabilityZones= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ EnableChassisAsGateway=true Oct 08 22:46:49 crc kubenswrapper[4834]: ++ PhysicalNetworks= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ OVNHostName= Oct 08 22:46:49 crc kubenswrapper[4834]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 08 22:46:49 crc kubenswrapper[4834]: ++ ovs_dir=/var/lib/openvswitch Oct 08 22:46:49 crc kubenswrapper[4834]: ++ 
FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 08 22:46:49 crc kubenswrapper[4834]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 08 22:46:49 crc kubenswrapper[4834]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 22:46:49 crc kubenswrapper[4834]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 22:46:49 crc kubenswrapper[4834]: + sleep 0.5 Oct 08 22:46:49 crc kubenswrapper[4834]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 08 22:46:49 crc kubenswrapper[4834]: + cleanup_ovsdb_server_semaphore Oct 08 22:46:49 crc kubenswrapper[4834]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 08 22:46:49 crc kubenswrapper[4834]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 08 22:46:49 crc kubenswrapper[4834]: > pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" containerID="cri-o://a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.200752 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" containerID="cri-o://a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" gracePeriod=29 Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.207435 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.209943 4834 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.209988 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.218161 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.218412 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api-log" containerID="cri-o://69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.218695 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api" containerID="cri-o://577329ac5e86fe36588abe9f509038517d2bba0f083da6b616ddb28e28603822" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.232693 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3518-account-create-g28bg"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.241363 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron3518-account-delete-sdqt9"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.262934 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zfddn"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.273368 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zfddn"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.296289 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" containerID="cri-o://dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" gracePeriod=29 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.301424 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3518-account-create-g28bg"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.331554 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.352842 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" probeResult="failure" output=< Oct 08 22:46:49 crc kubenswrapper[4834]: cat: /var/run/openvswitch/ovs-vswitchd.pid: No such file or directory Oct 08 22:46:49 crc kubenswrapper[4834]: ERROR - Failed to get pid for ovs-vswitchd, exit status: 0 Oct 08 22:46:49 crc kubenswrapper[4834]: > Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.367598 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-gzmg7"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.381215 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gzmg7"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.388201 4834 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/placement-3088-account-create-r9mqq"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.442545 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3088-account-create-r9mqq"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.489367 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement3088-account-delete-f6znv"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.514816 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f9gml"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.526923 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f9gml"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.533830 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-pqv9l"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.537398 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerName="rabbitmq" containerID="cri-o://d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b" gracePeriod=604800 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.541130 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6272-account-create-dszws"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.546801 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-pqv9l"] Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.559810 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.559880 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data podName:08a7721f-38a1-4a82-88ed-6f70290b5a6d 
nodeName:}" failed. No retries permitted until 2025-10-08 22:46:51.559862401 +0000 UTC m=+1419.382747147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data") pod "rabbitmq-server-0" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d") : configmap "rabbitmq-config-data" not found Oct 08 22:46:49 crc kubenswrapper[4834]: E1008 22:46:49.610689 4834 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701b75e6_1acc_47d0_85de_2349a6345a3b.slice/crio-conmon-e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fede876_b44b_40e1_8c56_9c35d2528e37.slice/crio-69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77e43c87_585d_4d7c_bd16_ab66b531e024.slice/crio-conmon-a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-conmon-5523f6432e877d33c1dd624b005e3a10c13f5c90d768a9ec5b33eb492f9cd80b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-6ce010fd5f1a7322ee19e0d85fe1859b18c483f1e6cee129d2324a72aca8c9ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62795e13_2e9c_4656_ab88_8788e50d37c5.slice/crio-conmon-4c640bd73e780e0d64b84db93f10b09fd43cf57588febe8f44765e1d6c226f04.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-3f5e1b6ecf4bdb7095f6e4f54237dd8d5e0fce913b21ceb7e2e9d3bbe4da4702.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701b75e6_1acc_47d0_85de_2349a6345a3b.slice/crio-f5e73cfdd6ec9a2f76bfa0052b93e5d772c1194c5cd98642eb2f1e306db7ed10.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-f098a2e0cb0624cb3fe14c1c933baa20366c8520017da6de05a7436114e9e875.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-conmon-32976af02fd539929cf231bf96ee9923fbea70a134e86a660abf29e813f31e4c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c5d01ab_b923_4829_9b10_6ad9010216eb.slice/crio-8572778b6d4762619545e6de6bc9dc967110b451eb1a7a8a406e0ababce1ccb3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-conmon-cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod701b75e6_1acc_47d0_85de_2349a6345a3b.slice/crio-e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ab95611_95ff_46bf_9b06_2ed44a58fa46.slice/crio-cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b7117c_11cc_4ba9_bd98_e25e6a56d8a6.slice/crio-048b028a05ed9d1e34b226eae2432e5d45037864ccaf2322ecdfa230f03f479c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fede876_b44b_40e1_8c56_9c35d2528e37.slice/crio-conmon-69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3.scope\": RecentStats: unable to find data in memory cache]" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.611429 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a24e03-f3be-433f-bbc1-3a25da713c65" path="/var/lib/kubelet/pods/15a24e03-f3be-433f-bbc1-3a25da713c65/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.622248 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287d1d4a-d93e-4866-89c5-72b876734d9e" path="/var/lib/kubelet/pods/287d1d4a-d93e-4866-89c5-72b876734d9e/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.622951 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d656893-3446-42fe-86ad-74e1b9d7ecd5" path="/var/lib/kubelet/pods/3d656893-3446-42fe-86ad-74e1b9d7ecd5/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.623577 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d83d01f-362f-463f-b837-8d39418f3abf" path="/var/lib/kubelet/pods/4d83d01f-362f-463f-b837-8d39418f3abf/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.631889 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c0210a-93d4-4f54-a542-d69c77229b9e" path="/var/lib/kubelet/pods/53c0210a-93d4-4f54-a542-d69c77229b9e/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.632502 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60bddd4c-dfc8-401c-8e6c-026cebd5703c" path="/var/lib/kubelet/pods/60bddd4c-dfc8-401c-8e6c-026cebd5703c/volumes" Oct 08 
22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.633077 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910a7214-6f0f-452b-adc2-91d1c2589d47" path="/var/lib/kubelet/pods/910a7214-6f0f-452b-adc2-91d1c2589d47/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.673108 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l24rp_9087d728-8ea1-4f0c-aff6-7dae2fd139ec/openstack-network-exporter/0.log" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.673192 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.675485 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9226bc4-6b16-45cd-a31a-163ad9b5aa53" path="/var/lib/kubelet/pods/b9226bc4-6b16-45cd-a31a-163ad9b5aa53/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.676892 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bedf43f2-17b7-462d-8ca1-41d4dae1e6cb" path="/var/lib/kubelet/pods/bedf43f2-17b7-462d-8ca1-41d4dae1e6cb/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.677751 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c708dd7c-f12f-49bf-a622-74b33227c62f" path="/var/lib/kubelet/pods/c708dd7c-f12f-49bf-a622-74b33227c62f/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.678502 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ecacbd-3766-4d66-a888-a0bed940192d" path="/var/lib/kubelet/pods/c8ecacbd-3766-4d66-a888-a0bed940192d/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.714956 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a0f324-ef5b-4320-b735-70a1f26376a0" path="/var/lib/kubelet/pods/d4a0f324-ef5b-4320-b735-70a1f26376a0/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724581 4834 
generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="fd0fe5fab5d6566b676108078b9bb8956f0d4000e18eeb98d65adfb995ed66db" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724634 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724650 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="d032b4a1902d533ed5cdb4bbf1a9f2c37094a94a35f3ce8f02ac5f212367ceee" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724659 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="f098a2e0cb0624cb3fe14c1c933baa20366c8520017da6de05a7436114e9e875" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724677 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="3f5e1b6ecf4bdb7095f6e4f54237dd8d5e0fce913b21ceb7e2e9d3bbe4da4702" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724686 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="f652c6fc5992ba73c5161286567da0fcc0cd2540cfc9653f9d3e00b5ca106caa" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724696 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="6ce010fd5f1a7322ee19e0d85fe1859b18c483f1e6cee129d2324a72aca8c9ae" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724706 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="2091dbf498187923ad07111022a998400d80f39064fb32a84e0b373a124b3d4d" exitCode=0 Oct 08 22:46:49 crc 
kubenswrapper[4834]: I1008 22:46:49.724716 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="32976af02fd539929cf231bf96ee9923fbea70a134e86a660abf29e813f31e4c" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724727 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="5523f6432e877d33c1dd624b005e3a10c13f5c90d768a9ec5b33eb492f9cd80b" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.724736 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="7514b6a8989d60bf273471f9730c93e60d545194b6408566793dc09f41f4ed2c" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.728639 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de233f0c-46d3-491e-83e6-b2334ba4ebd5" path="/var/lib/kubelet/pods/de233f0c-46d3-491e-83e6-b2334ba4ebd5/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.728764 4834 generic.go:334] "Generic (PLEG): container finished" podID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerID="97de4609bf212569f354a8db48765bc289647b7d03ecd1100224cb7a89ad47c3" exitCode=143 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.748874 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0428da9-4f94-4297-b639-c8b777b1d216" path="/var/lib/kubelet/pods/e0428da9-4f94-4297-b639-c8b777b1d216/volumes" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749621 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6272-account-create-dszws"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749646 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-57d9-account-create-hm5jw"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749658 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"fd0fe5fab5d6566b676108078b9bb8956f0d4000e18eeb98d65adfb995ed66db"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749686 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-57d9-account-create-hm5jw"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749702 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749715 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-r88vk"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749727 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9716-account-delete-6zvrv"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749740 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-r88vk"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749751 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"d032b4a1902d533ed5cdb4bbf1a9f2c37094a94a35f3ce8f02ac5f212367ceee"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749760 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9716-account-create-cjmxg"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749773 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"f098a2e0cb0624cb3fe14c1c933baa20366c8520017da6de05a7436114e9e875"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749784 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"3f5e1b6ecf4bdb7095f6e4f54237dd8d5e0fce913b21ceb7e2e9d3bbe4da4702"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"f652c6fc5992ba73c5161286567da0fcc0cd2540cfc9653f9d3e00b5ca106caa"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749814 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9716-account-create-cjmxg"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749824 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"6ce010fd5f1a7322ee19e0d85fe1859b18c483f1e6cee129d2324a72aca8c9ae"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749833 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"2091dbf498187923ad07111022a998400d80f39064fb32a84e0b373a124b3d4d"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749841 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"32976af02fd539929cf231bf96ee9923fbea70a134e86a660abf29e813f31e4c"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749850 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"5523f6432e877d33c1dd624b005e3a10c13f5c90d768a9ec5b33eb492f9cd80b"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749860 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"7514b6a8989d60bf273471f9730c93e60d545194b6408566793dc09f41f4ed2c"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.749871 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37143980-a3f8-4398-a1d7-0f8189fb5366","Type":"ContainerDied","Data":"97de4609bf212569f354a8db48765bc289647b7d03ecd1100224cb7a89ad47c3"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.756481 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.765242 4834 generic.go:334] "Generic (PLEG): container finished" podID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerID="4c640bd73e780e0d64b84db93f10b09fd43cf57588febe8f44765e1d6c226f04" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.765324 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6cc747c5-vzjm2" event={"ID":"62795e13-2e9c-4656-ab88-8788e50d37c5","Type":"ContainerDied","Data":"4c640bd73e780e0d64b84db93f10b09fd43cf57588febe8f44765e1d6c226f04"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.767506 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.767695 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-log" containerID="cri-o://e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.767943 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-metadata" 
containerID="cri-o://076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.770681 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-config\") pod \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.770788 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovs-rundir\") pod \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.770850 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-metrics-certs-tls-certs\") pod \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.770942 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vntm4\" (UniqueName: \"kubernetes.io/projected/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-kube-api-access-vntm4\") pod \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.771017 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovn-rundir\") pod \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.771057 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-combined-ca-bundle\") pod \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\" (UID: \"9087d728-8ea1-4f0c-aff6-7dae2fd139ec\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.773335 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "9087d728-8ea1-4f0c-aff6-7dae2fd139ec" (UID: "9087d728-8ea1-4f0c-aff6-7dae2fd139ec"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.773372 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "9087d728-8ea1-4f0c-aff6-7dae2fd139ec" (UID: "9087d728-8ea1-4f0c-aff6-7dae2fd139ec"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.773416 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-config" (OuterVolumeSpecName: "config") pod "9087d728-8ea1-4f0c-aff6-7dae2fd139ec" (UID: "9087d728-8ea1-4f0c-aff6-7dae2fd139ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.775725 4834 generic.go:334] "Generic (PLEG): container finished" podID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerID="50d402217bd3c2796ec12c38a36a0c886b0ae574eb8874d5e4be8b40f8de8693" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.776192 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" event={"ID":"596135ed-4d76-4dec-94bd-cf17dfbfe2d6","Type":"ContainerDied","Data":"50d402217bd3c2796ec12c38a36a0c886b0ae574eb8874d5e4be8b40f8de8693"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.779999 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-kube-api-access-vntm4" (OuterVolumeSpecName: "kube-api-access-vntm4") pod "9087d728-8ea1-4f0c-aff6-7dae2fd139ec" (UID: "9087d728-8ea1-4f0c-aff6-7dae2fd139ec"). InnerVolumeSpecName "kube-api-access-vntm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.780315 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7bm6j" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.785421 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7779b9cfc5-lq477"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.785730 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7779b9cfc5-lq477" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker-log" containerID="cri-o://49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.786929 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7779b9cfc5-lq477" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker" containerID="cri-o://91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.793422 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.793680 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-log" containerID="cri-o://1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.793824 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-api" containerID="cri-o://91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.794334 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_701b75e6-1acc-47d0-85de-2349a6345a3b/ovsdbserver-sb/0.log" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.794363 4834 generic.go:334] "Generic (PLEG): container finished" podID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerID="e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76" exitCode=2 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.794374 4834 generic.go:334] "Generic (PLEG): container finished" podID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerID="f5e73cfdd6ec9a2f76bfa0052b93e5d772c1194c5cd98642eb2f1e306db7ed10" exitCode=143 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.794406 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"701b75e6-1acc-47d0-85de-2349a6345a3b","Type":"ContainerDied","Data":"e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.794422 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"701b75e6-1acc-47d0-85de-2349a6345a3b","Type":"ContainerDied","Data":"f5e73cfdd6ec9a2f76bfa0052b93e5d772c1194c5cd98642eb2f1e306db7ed10"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.805393 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-688bb4b854-srcv6"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.805660 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener-log" containerID="cri-o://e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.805774 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" 
podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener" containerID="cri-o://f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.815480 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9087d728-8ea1-4f0c-aff6-7dae2fd139ec" (UID: "9087d728-8ea1-4f0c-aff6-7dae2fd139ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.838905 4834 generic.go:334] "Generic (PLEG): container finished" podID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerID="69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3" exitCode=143 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.838978 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fede876-b44b-40e1-8c56-9c35d2528e37","Type":"ContainerDied","Data":"69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.855467 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rgzpk"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.859234 4834 generic.go:334] "Generic (PLEG): container finished" podID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerID="8572778b6d4762619545e6de6bc9dc967110b451eb1a7a8a406e0ababce1ccb3" exitCode=143 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.859310 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b44744c4-z2p4d" event={"ID":"6c5d01ab-b923-4829-9b10-6ad9010216eb","Type":"ContainerDied","Data":"8572778b6d4762619545e6de6bc9dc967110b451eb1a7a8a406e0ababce1ccb3"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.885992 4834 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run-ovn\") pod \"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.886332 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-combined-ca-bundle\") pod \"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.886416 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6rtf\" (UniqueName: \"kubernetes.io/projected/74f0068c-4e61-4079-9d62-b338472e817d-kube-api-access-d6rtf\") pod \"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.886435 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-log-ovn\") pod \"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.886506 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run\") pod \"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.886550 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-ovn-controller-tls-certs\") pod 
\"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.886578 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f0068c-4e61-4079-9d62-b338472e817d-scripts\") pod \"74f0068c-4e61-4079-9d62-b338472e817d\" (UID: \"74f0068c-4e61-4079-9d62-b338472e817d\") " Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.889470 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run" (OuterVolumeSpecName: "var-run") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.889573 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.889602 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.899367 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rgzpk"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900280 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f0068c-4e61-4079-9d62-b338472e817d-scripts" (OuterVolumeSpecName: "scripts") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900896 4834 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900924 4834 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900946 4834 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900955 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74f0068c-4e61-4079-9d62-b338472e817d-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900964 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vntm4\" (UniqueName: \"kubernetes.io/projected/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-kube-api-access-vntm4\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 
22:46:49.900973 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900980 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900989 4834 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74f0068c-4e61-4079-9d62-b338472e817d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.900997 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.921493 4834 generic.go:334] "Generic (PLEG): container finished" podID="74f0068c-4e61-4079-9d62-b338472e817d" containerID="0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.921650 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7bm6j" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.921962 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9f9b7d4b4-cr99t"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.922005 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7bm6j" event={"ID":"74f0068c-4e61-4079-9d62-b338472e817d","Type":"ContainerDied","Data":"0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.922032 4834 scope.go:117] "RemoveContainer" containerID="0684ef58f2b375d2dba89b1f3c7e9edf759cbd90a9d86397a6cbdc6c5ca25b8c" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.922367 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9f9b7d4b4-cr99t" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api-log" containerID="cri-o://ee03e22162c15519f33753c495b20abc4d67e8ca443ee3f90594ba5404a7a3a2" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.922510 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9f9b7d4b4-cr99t" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api" containerID="cri-o://fd63b549a1986038c721e698592fdda290ebfb583fc9734ad9ac998ea85e14d3" gracePeriod=30 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.939372 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapif0f4-account-delete-b5tm7"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.942102 4834 generic.go:334] "Generic (PLEG): container finished" podID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerID="354d0197ed5738529f4ce14ae4d167d9d0a781f57eca46b2a301712e02875868" exitCode=143 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.942303 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"f1c297e1-ec55-4113-a87d-7813a27c03d9","Type":"ContainerDied","Data":"354d0197ed5738529f4ce14ae4d167d9d0a781f57eca46b2a301712e02875868"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.954548 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-9f9b7d4b4-cr99t" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:49612->10.217.0.162:9311: read: connection reset by peer" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.956626 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f0068c-4e61-4079-9d62-b338472e817d-kube-api-access-d6rtf" (OuterVolumeSpecName: "kube-api-access-d6rtf") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "kube-api-access-d6rtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.957314 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2ad8-account-create-x9wt9"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.971848 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2ad8-account-create-x9wt9"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.974020 4834 generic.go:334] "Generic (PLEG): container finished" podID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" containerID="048b028a05ed9d1e34b226eae2432e5d45037864ccaf2322ecdfa230f03f479c" exitCode=137 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.978511 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-f0f4-account-create-hgs5p"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.980239 4834 generic.go:334] "Generic (PLEG): container finished" podID="77e43c87-585d-4d7c-bd16-ab66b531e024" 
containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" exitCode=0 Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.980344 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerDied","Data":"a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.983169 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l24rp_9087d728-8ea1-4f0c-aff6-7dae2fd139ec/openstack-network-exporter/0.log" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.983309 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l24rp" event={"ID":"9087d728-8ea1-4f0c-aff6-7dae2fd139ec","Type":"ContainerDied","Data":"7e8e0b124ac904f0c5eb8e8cff7c9358eec39f02813ce3062ef5855349622ca9"} Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.983503 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-l24rp" Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.988900 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hkfnr"] Oct 08 22:46:49 crc kubenswrapper[4834]: I1008 22:46:49.997736 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-f0f4-account-create-hgs5p"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.002953 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6rtf\" (UniqueName: \"kubernetes.io/projected/74f0068c-4e61-4079-9d62-b338472e817d-kube-api-access-d6rtf\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.003044 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.003223 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data podName:9809d14f-10d2-479f-94d9-5b3ae7f49e7b nodeName:}" failed. No retries permitted until 2025-10-08 22:46:52.003198616 +0000 UTC m=+1419.826083362 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b") : configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.004901 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hkfnr"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.030426 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.040346 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.040544 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="caf766d3-49fe-4a20-bf0e-405ccca15c69" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fd7251bd930a8b2a1fa53071747b9a1c8fa4231c37dca7643bd1503481f77ec1" gracePeriod=30 Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.066375 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.068567 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl62z"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.078377 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dl62z"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.079073 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9087d728-8ea1-4f0c-aff6-7dae2fd139ec" (UID: "9087d728-8ea1-4f0c-aff6-7dae2fd139ec"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.088313 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.088530 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="92213f20-28bf-4fe1-b547-6867677b0049" containerName="nova-cell1-conductor-conductor" containerID="cri-o://16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469" gracePeriod=30 Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.091004 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd is running failed: container process not found" containerID="77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.091523 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd is running failed: container process not found" containerID="77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.091875 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd is running failed: container process not found" containerID="77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.091971 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="ovsdbserver-nb" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.097250 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.097445 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="81641859-a43e-4d35-bc09-f541277c77da" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" gracePeriod=30 Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.102295 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="rabbitmq" 
containerID="cri-o://5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa" gracePeriod=604800 Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.107306 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.107344 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9087d728-8ea1-4f0c-aff6-7dae2fd139ec-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.125062 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zp2gc"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.131766 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zp2gc"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.149540 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.149804 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerName="nova-scheduler-scheduler" containerID="cri-o://49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" gracePeriod=30 Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.166211 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.192337 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "74f0068c-4e61-4079-9d62-b338472e817d" (UID: "74f0068c-4e61-4079-9d62-b338472e817d"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.192901 4834 scope.go:117] "RemoveContainer" containerID="cf8764a03b4bf1c3a07b250cdaacaf2edfad20da4aa53f6789acb7d9ee72de4d" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.194131 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46d2d9d8-6fb8-45a7-bcba-5d6121b26dda/ovsdbserver-nb/0.log" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.194260 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.204856 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_701b75e6-1acc-47d0-85de-2349a6345a3b/ovsdbserver-sb/0.log" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.204933 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209249 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-combined-ca-bundle\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209296 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-scripts\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209318 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-scripts\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209345 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnt5k\" (UniqueName: \"kubernetes.io/projected/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-kube-api-access-dnt5k\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209375 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-swift-storage-0\") pod \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209391 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-sb\") pod \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209420 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209448 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209475 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-config\") pod \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209494 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-svc\") pod \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209514 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdbserver-sb-tls-certs\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209529 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-nb\") pod \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209571 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-config\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209587 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-metrics-certs-tls-certs\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209609 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdbserver-nb-tls-certs\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209633 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-config\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209655 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdb-rundir\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: 
\"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209733 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h299\" (UniqueName: \"kubernetes.io/projected/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-kube-api-access-4h299\") pod \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\" (UID: \"596135ed-4d76-4dec-94bd-cf17dfbfe2d6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209798 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-combined-ca-bundle\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209827 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-metrics-certs-tls-certs\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209846 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvwj\" (UniqueName: \"kubernetes.io/projected/701b75e6-1acc-47d0-85de-2349a6345a3b-kube-api-access-nfvwj\") pod \"701b75e6-1acc-47d0-85de-2349a6345a3b\" (UID: \"701b75e6-1acc-47d0-85de-2349a6345a3b\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.209865 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdb-rundir\") pod \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\" (UID: \"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.210319 4834 reconciler_common.go:293] "Volume detached for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/74f0068c-4e61-4079-9d62-b338472e817d-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.210759 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.211947 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-config" (OuterVolumeSpecName: "config") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.212501 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.221678 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-kube-api-access-4h299" (OuterVolumeSpecName: "kube-api-access-4h299") pod "596135ed-4d76-4dec-94bd-cf17dfbfe2d6" (UID: "596135ed-4d76-4dec-94bd-cf17dfbfe2d6"). InnerVolumeSpecName "kube-api-access-4h299". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.221983 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.222621 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-config" (OuterVolumeSpecName: "config") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.226611 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-scripts" (OuterVolumeSpecName: "scripts") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.232925 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701b75e6-1acc-47d0-85de-2349a6345a3b-kube-api-access-nfvwj" (OuterVolumeSpecName: "kube-api-access-nfvwj") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "kube-api-access-nfvwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.234557 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-kube-api-access-dnt5k" (OuterVolumeSpecName: "kube-api-access-dnt5k") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "kube-api-access-dnt5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.234938 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-scripts" (OuterVolumeSpecName: "scripts") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.239445 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="894c1f04-42d4-43de-a34a-19200ceec426" containerName="galera" containerID="cri-o://d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7" gracePeriod=30 Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.243817 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.251802 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.263561 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.271341 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7bm6j"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.281326 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.281332 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.287268 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:46:50 crc kubenswrapper[4834]: E1008 22:46:50.287322 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerName="nova-scheduler-scheduler" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.288426 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7bm6j"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.312462 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.319646 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfvwj\" (UniqueName: \"kubernetes.io/projected/701b75e6-1acc-47d0-85de-2349a6345a3b-kube-api-access-nfvwj\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.319789 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.319804 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.319813 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-scripts\") on node 
\"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.319823 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnt5k\" (UniqueName: \"kubernetes.io/projected/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-kube-api-access-dnt5k\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.322780 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.322806 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.322817 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.322827 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/701b75e6-1acc-47d0-85de-2349a6345a3b-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.322846 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.322855 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h299\" (UniqueName: \"kubernetes.io/projected/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-kube-api-access-4h299\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.323732 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-controller-metrics-l24rp"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.332262 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-l24rp"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.334651 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.428747 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.436801 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "596135ed-4d76-4dec-94bd-cf17dfbfe2d6" (UID: "596135ed-4d76-4dec-94bd-cf17dfbfe2d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.457845 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "596135ed-4d76-4dec-94bd-cf17dfbfe2d6" (UID: "596135ed-4d76-4dec-94bd-cf17dfbfe2d6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.459953 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-config" (OuterVolumeSpecName: "config") pod "596135ed-4d76-4dec-94bd-cf17dfbfe2d6" (UID: "596135ed-4d76-4dec-94bd-cf17dfbfe2d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.477405 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.477854 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "596135ed-4d76-4dec-94bd-cf17dfbfe2d6" (UID: "596135ed-4d76-4dec-94bd-cf17dfbfe2d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.478804 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "596135ed-4d76-4dec-94bd-cf17dfbfe2d6" (UID: "596135ed-4d76-4dec-94bd-cf17dfbfe2d6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.493948 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.496952 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.533903 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.533934 4834 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.533943 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.533953 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.533961 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.533996 4834 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.534007 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/596135ed-4d76-4dec-94bd-cf17dfbfe2d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.534016 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.545400 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" (UID: "46d2d9d8-6fb8-45a7-bcba-5d6121b26dda"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.560227 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.584470 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "701b75e6-1acc-47d0-85de-2349a6345a3b" (UID: "701b75e6-1acc-47d0-85de-2349a6345a3b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.600553 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.636170 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.636208 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/701b75e6-1acc-47d0-85de-2349a6345a3b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.636241 4834 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.648673 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.653397 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement3088-account-delete-f6znv"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.719298 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9716-account-delete-6zvrv"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.730193 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron3518-account-delete-sdqt9"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.751974 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config\") pod \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.752080 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config-secret\") pod \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.752221 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-combined-ca-bundle\") pod \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.752297 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n9ln\" (UniqueName: \"kubernetes.io/projected/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-kube-api-access-4n9ln\") pod \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\" (UID: \"b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6\") " Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.769701 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-kube-api-access-4n9ln" (OuterVolumeSpecName: "kube-api-access-4n9ln") pod "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" (UID: "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6"). InnerVolumeSpecName "kube-api-access-4n9ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.775707 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" (UID: "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.813331 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" (UID: "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.858923 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.858967 4834 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.859000 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n9ln\" (UniqueName: \"kubernetes.io/projected/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-kube-api-access-4n9ln\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.864189 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" (UID: "b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.940159 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapif0f4-account-delete-b5tm7"] Oct 08 22:46:50 crc kubenswrapper[4834]: I1008 22:46:50.969322 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.018609 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-69f6cbfd5c-82mhn"] Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.019393 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-httpd" containerID="cri-o://8178bb5c63d59751fa04e9a8611028105cb6a3f042c30ae51e01c44207c48306" gracePeriod=30 Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.019673 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-server" containerID="cri-o://62ac2f468ed7a8ca1ccfd138476149a4798728df3269c8a1e691fb31a153f440" gracePeriod=30 Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.044080 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9716-account-delete-6zvrv" event={"ID":"a20774f5-74f4-4f7f-9f33-b4b55585cb7d","Type":"ContainerStarted","Data":"822225feb6e18a24a8f2baaee830273a0a118824e5cfb030f7a0e4c90a000f5f"} Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 
22:46:51.047976 4834 generic.go:334] "Generic (PLEG): container finished" podID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerID="16284da8c6f35291d6b17fb6dfd3cb470e7f9c4ec4f746f4bc0659d2fb85b5fb" exitCode=0 Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.048004 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c","Type":"ContainerDied","Data":"16284da8c6f35291d6b17fb6dfd3cb470e7f9c4ec4f746f4bc0659d2fb85b5fb"} Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.050030 4834 generic.go:334] "Generic (PLEG): container finished" podID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerID="e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f" exitCode=143 Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.050066 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f052dbd-010a-456f-af57-0b6b2f6e70ad","Type":"ContainerDied","Data":"e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f"} Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.051912 4834 generic.go:334] "Generic (PLEG): container finished" podID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerID="1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4" exitCode=143 Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.051986 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4629ae3-d685-43c9-81fd-49e84abd427f","Type":"ContainerDied","Data":"1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4"} Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.053333 4834 generic.go:334] "Generic (PLEG): container finished" podID="caf766d3-49fe-4a20-bf0e-405ccca15c69" containerID="fd7251bd930a8b2a1fa53071747b9a1c8fa4231c37dca7643bd1503481f77ec1" exitCode=0 Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.053380 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"caf766d3-49fe-4a20-bf0e-405ccca15c69","Type":"ContainerDied","Data":"fd7251bd930a8b2a1fa53071747b9a1c8fa4231c37dca7643bd1503481f77ec1"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.054330 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron3518-account-delete-sdqt9" event={"ID":"f70f2f55-ae76-4f8a-95a4-49933695ff6b","Type":"ContainerStarted","Data":"b8929a1875e8f2c32fdf30614626e50557abd2cda29cedaa59a0c9bbee2c83cb"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.055607 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement3088-account-delete-f6znv" event={"ID":"12663254-035f-4057-b178-2dc4d42db157","Type":"ContainerStarted","Data":"e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.055627 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement3088-account-delete-f6znv" event={"ID":"12663254-035f-4057-b178-2dc4d42db157","Type":"ContainerStarted","Data":"6abb970dde49492b4b7392cf661dc12baca04665226a1490ba9d6d33b0e96384"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.069093 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1c373-account-delete-cq7ht"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.080391 4834 generic.go:334] "Generic (PLEG): container finished" podID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerID="49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387" exitCode=143
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.080459 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7779b9cfc5-lq477" event={"ID":"2f7d4f35-145c-4af9-9f4b-de8700877370","Type":"ContainerDied","Data":"49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.088037 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46d2d9d8-6fb8-45a7-bcba-5d6121b26dda/ovsdbserver-nb/0.log"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.088102 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46d2d9d8-6fb8-45a7-bcba-5d6121b26dda","Type":"ContainerDied","Data":"12e93a451c4be4fa61f58546f4fce10f4939859ee44ad0080902babf1f66944b"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.088138 4834 scope.go:117] "RemoveContainer" containerID="28c0d0fa60a03d4e0502734c048a91f1f1485f354314d1d0e3b140ec8efc322e"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.088268 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.095643 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg" event={"ID":"596135ed-4d76-4dec-94bd-cf17dfbfe2d6","Type":"ContainerDied","Data":"ab715cc62a51d5f54bcf374992049fea0483cd825e3d752f6638b2a06f69ed76"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.095852 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5967cc9597-s2zsg"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.111484 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_701b75e6-1acc-47d0-85de-2349a6345a3b/ovsdbserver-sb/0.log"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.111542 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"701b75e6-1acc-47d0-85de-2349a6345a3b","Type":"ContainerDied","Data":"a34882ae29fc24ac9c03f5a7b262b297a0dfe1f7995f4b32690e15875b764330"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.111619 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.118441 4834 generic.go:334] "Generic (PLEG): container finished" podID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerID="ee03e22162c15519f33753c495b20abc4d67e8ca443ee3f90594ba5404a7a3a2" exitCode=143
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.118534 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9f9b7d4b4-cr99t" event={"ID":"a163bab0-7bd2-4272-a1f0-cd0090eed141","Type":"ContainerDied","Data":"ee03e22162c15519f33753c495b20abc4d67e8ca443ee3f90594ba5404a7a3a2"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.157415 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="916771be477f120fd96b5a1f5443d68a2ee7c86b4f15262feb8c7da880418d12" exitCode=0
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.157443 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="d8010142af61d9a4bcf191642a46ef203b560507fdc751d54f09b14df4705c8f" exitCode=0
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.157450 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="06fd06ac9cbc7b6f70094375f8d6b9bf90833b141ddadf2cad95017b587d05a9" exitCode=0
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.157486 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"916771be477f120fd96b5a1f5443d68a2ee7c86b4f15262feb8c7da880418d12"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.157511 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"d8010142af61d9a4bcf191642a46ef203b560507fdc751d54f09b14df4705c8f"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.157521 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"06fd06ac9cbc7b6f70094375f8d6b9bf90833b141ddadf2cad95017b587d05a9"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.169980 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.174975 4834 generic.go:334] "Generic (PLEG): container finished" podID="6122ff69-d6fb-4002-8679-80b826faf58f" containerID="e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a" exitCode=143
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.175032 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" event={"ID":"6122ff69-d6fb-4002-8679-80b826faf58f","Type":"ContainerDied","Data":"e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a"}
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.233158 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.242851 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.248524 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5967cc9597-s2zsg"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.254817 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5967cc9597-s2zsg"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.261184 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.266333 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.272408 4834 scope.go:117] "RemoveContainer" containerID="77fd525062b3fc27651cefd56aca9e6e25cf804fe96249c8aafd5597340b6dbd"
Oct 08 22:46:51 crc kubenswrapper[4834]: E1008 22:46:51.327262 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 08 22:46:51 crc kubenswrapper[4834]: E1008 22:46:51.330525 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 08 22:46:51 crc kubenswrapper[4834]: E1008 22:46:51.341055 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Oct 08 22:46:51 crc kubenswrapper[4834]: E1008 22:46:51.341126 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="ovn-northd"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.565726 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ccc4594-4823-4520-af7d-213d6dac2490" path="/var/lib/kubelet/pods/0ccc4594-4823-4520-af7d-213d6dac2490/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.566427 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26081853-747e-4f7b-af9b-819dc967f807" path="/var/lib/kubelet/pods/26081853-747e-4f7b-af9b-819dc967f807/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.566992 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f60664-0524-49bc-8e17-19305b2ae60a" path="/var/lib/kubelet/pods/30f60664-0524-49bc-8e17-19305b2ae60a/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.568125 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" path="/var/lib/kubelet/pods/46d2d9d8-6fb8-45a7-bcba-5d6121b26dda/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.568817 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4adf21b4-b07b-41df-b887-29580e96f8b9" path="/var/lib/kubelet/pods/4adf21b4-b07b-41df-b887-29580e96f8b9/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.570557 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d347a09-693b-4c37-8a0c-5143e16fd9f8" path="/var/lib/kubelet/pods/4d347a09-693b-4c37-8a0c-5143e16fd9f8/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.571126 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" path="/var/lib/kubelet/pods/596135ed-4d76-4dec-94bd-cf17dfbfe2d6/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.572757 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6082b9c6-252a-49ee-bcd5-7d58bd99ff23" path="/var/lib/kubelet/pods/6082b9c6-252a-49ee-bcd5-7d58bd99ff23/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.573338 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" path="/var/lib/kubelet/pods/701b75e6-1acc-47d0-85de-2349a6345a3b/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.573838 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739f746e-d763-46b9-9512-1c8dde821ada" path="/var/lib/kubelet/pods/739f746e-d763-46b9-9512-1c8dde821ada/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.574956 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f0068c-4e61-4079-9d62-b338472e817d" path="/var/lib/kubelet/pods/74f0068c-4e61-4079-9d62-b338472e817d/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.575667 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9087d728-8ea1-4f0c-aff6-7dae2fd139ec" path="/var/lib/kubelet/pods/9087d728-8ea1-4f0c-aff6-7dae2fd139ec/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.576491 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1ee346-0de8-44e6-a240-05669af5a41e" path="/var/lib/kubelet/pods/9d1ee346-0de8-44e6-a240-05669af5a41e/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.579344 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6f6cc747c5-vzjm2" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.165:9696/\": dial tcp 10.217.0.165:9696: connect: connection refused"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.579816 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" path="/var/lib/kubelet/pods/b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.580403 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafc6c7f-9e9b-4232-b0dc-82225a78e1d2" path="/var/lib/kubelet/pods/eafc6c7f-9e9b-4232-b0dc-82225a78e1d2/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.580922 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1278e3-01d3-4a69-9689-caafc578bbb0" path="/var/lib/kubelet/pods/ff1278e3-01d3-4a69-9689-caafc578bbb0/volumes"
Oct 08 22:46:51 crc kubenswrapper[4834]: E1008 22:46:51.615730 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Oct 08 22:46:51 crc kubenswrapper[4834]: E1008 22:46:51.615806 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data podName:08a7721f-38a1-4a82-88ed-6f70290b5a6d nodeName:}" failed. No retries permitted until 2025-10-08 22:46:55.615788178 +0000 UTC m=+1423.438672914 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data") pod "rabbitmq-server-0" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d") : configmap "rabbitmq-config-data" not found
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.616326 4834 scope.go:117] "RemoveContainer" containerID="50d402217bd3c2796ec12c38a36a0c886b0ae574eb8874d5e4be8b40f8de8693"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.703167 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.733844 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement3088-account-delete-f6znv"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.736029 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.781403 4834 scope.go:117] "RemoveContainer" containerID="6b374ba9360aed2a69e514af5204d8b7f9a4d40ce6ebd3eac7796f0331f0d312"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.813584 4834 scope.go:117] "RemoveContainer" containerID="e630c1079c525a16879ee972cc2c3b32e171adbd6c3917c5714b8770364bfc76"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.821800 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-operator-scripts\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.821885 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-secrets\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.821910 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-config-data-default\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.821949 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-nova-novncproxy-tls-certs\") pod \"caf766d3-49fe-4a20-bf0e-405ccca15c69\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.821977 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hpfd\" (UniqueName: \"kubernetes.io/projected/caf766d3-49fe-4a20-bf0e-405ccca15c69-kube-api-access-8hpfd\") pod \"caf766d3-49fe-4a20-bf0e-405ccca15c69\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822045 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-config-data\") pod \"caf766d3-49fe-4a20-bf0e-405ccca15c69\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822068 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822095 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-combined-ca-bundle\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822131 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjnjf\" (UniqueName: \"kubernetes.io/projected/894c1f04-42d4-43de-a34a-19200ceec426-kube-api-access-kjnjf\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822184 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-galera-tls-certs\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822207 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4kbd\" (UniqueName: \"kubernetes.io/projected/12663254-035f-4057-b178-2dc4d42db157-kube-api-access-n4kbd\") pod \"12663254-035f-4057-b178-2dc4d42db157\" (UID: \"12663254-035f-4057-b178-2dc4d42db157\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822271 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-vencrypt-tls-certs\") pod \"caf766d3-49fe-4a20-bf0e-405ccca15c69\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822306 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-combined-ca-bundle\") pod \"caf766d3-49fe-4a20-bf0e-405ccca15c69\" (UID: \"caf766d3-49fe-4a20-bf0e-405ccca15c69\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822345 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-kolla-config\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822387 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894c1f04-42d4-43de-a34a-19200ceec426-config-data-generated\") pod \"894c1f04-42d4-43de-a34a-19200ceec426\" (UID: \"894c1f04-42d4-43de-a34a-19200ceec426\") "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.822694 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.826775 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.826865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/894c1f04-42d4-43de-a34a-19200ceec426-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.826961 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.827613 4834 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-kolla-config\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.828050 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/894c1f04-42d4-43de-a34a-19200ceec426-config-data-generated\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.828213 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-operator-scripts\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.828283 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/894c1f04-42d4-43de-a34a-19200ceec426-config-data-default\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.835623 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf766d3-49fe-4a20-bf0e-405ccca15c69-kube-api-access-8hpfd" (OuterVolumeSpecName: "kube-api-access-8hpfd") pod "caf766d3-49fe-4a20-bf0e-405ccca15c69" (UID: "caf766d3-49fe-4a20-bf0e-405ccca15c69"). InnerVolumeSpecName "kube-api-access-8hpfd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.842211 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12663254-035f-4057-b178-2dc4d42db157-kube-api-access-n4kbd" (OuterVolumeSpecName: "kube-api-access-n4kbd") pod "12663254-035f-4057-b178-2dc4d42db157" (UID: "12663254-035f-4057-b178-2dc4d42db157"). InnerVolumeSpecName "kube-api-access-n4kbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.842980 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894c1f04-42d4-43de-a34a-19200ceec426-kube-api-access-kjnjf" (OuterVolumeSpecName: "kube-api-access-kjnjf") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "kube-api-access-kjnjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.843322 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-secrets" (OuterVolumeSpecName: "secrets") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.845214 4834 scope.go:117] "RemoveContainer" containerID="f5e73cfdd6ec9a2f76bfa0052b93e5d772c1194c5cd98642eb2f1e306db7ed10"
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.862042 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.901852 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.901865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caf766d3-49fe-4a20-bf0e-405ccca15c69" (UID: "caf766d3-49fe-4a20-bf0e-405ccca15c69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.905902 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-config-data" (OuterVolumeSpecName: "config-data") pod "caf766d3-49fe-4a20-bf0e-405ccca15c69" (UID: "caf766d3-49fe-4a20-bf0e-405ccca15c69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929576 4834 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-secrets\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929602 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hpfd\" (UniqueName: \"kubernetes.io/projected/caf766d3-49fe-4a20-bf0e-405ccca15c69-kube-api-access-8hpfd\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929611 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-config-data\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929630 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929641 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929651 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjnjf\" (UniqueName: \"kubernetes.io/projected/894c1f04-42d4-43de-a34a-19200ceec426-kube-api-access-kjnjf\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929660 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4kbd\" (UniqueName: \"kubernetes.io/projected/12663254-035f-4057-b178-2dc4d42db157-kube-api-access-n4kbd\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.929670 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.932546 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "caf766d3-49fe-4a20-bf0e-405ccca15c69" (UID: "caf766d3-49fe-4a20-bf0e-405ccca15c69"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.965880 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "caf766d3-49fe-4a20-bf0e-405ccca15c69" (UID: "caf766d3-49fe-4a20-bf0e-405ccca15c69"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.980907 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "894c1f04-42d4-43de-a34a-19200ceec426" (UID: "894c1f04-42d4-43de-a34a-19200ceec426"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 22:46:51 crc kubenswrapper[4834]: I1008 22:46:51.989820 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.031283 4834 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.031309 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.031318 4834 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/894c1f04-42d4-43de-a34a-19200ceec426-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.031329 4834 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/caf766d3-49fe-4a20-bf0e-405ccca15c69-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 08 22:46:52 crc kubenswrapper[4834]: E1008 22:46:52.031396 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Oct 08 22:46:52 crc kubenswrapper[4834]: E1008 22:46:52.031466 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data podName:9809d14f-10d2-479f-94d9-5b3ae7f49e7b nodeName:}" failed. No retries permitted until 2025-10-08 22:46:56.031429308 +0000 UTC m=+1423.854314054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b") : configmap "rabbitmq-cell1-config-data" not found
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.191556 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"caf766d3-49fe-4a20-bf0e-405ccca15c69","Type":"ContainerDied","Data":"0b21c10d816c2c7153b5c45b8db39059b03941a188f9cac22497427fa550ca0b"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.191583 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.195937 4834 generic.go:334] "Generic (PLEG): container finished" podID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerID="62ac2f468ed7a8ca1ccfd138476149a4798728df3269c8a1e691fb31a153f440" exitCode=0
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.195973 4834 generic.go:334] "Generic (PLEG): container finished" podID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerID="8178bb5c63d59751fa04e9a8611028105cb6a3f042c30ae51e01c44207c48306" exitCode=0
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.196035 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" event={"ID":"c5aa1aef-afe2-4b70-9033-c62921f3d106","Type":"ContainerDied","Data":"62ac2f468ed7a8ca1ccfd138476149a4798728df3269c8a1e691fb31a153f440"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.196212 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" event={"ID":"c5aa1aef-afe2-4b70-9033-c62921f3d106","Type":"ContainerDied","Data":"8178bb5c63d59751fa04e9a8611028105cb6a3f042c30ae51e01c44207c48306"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.196233 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" event={"ID":"c5aa1aef-afe2-4b70-9033-c62921f3d106","Type":"ContainerDied","Data":"a42c2b464216c2c57060cfdd7711c877ae77f5b4d9f363897d48919f5a6b5ef8"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.196249 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42c2b464216c2c57060cfdd7711c877ae77f5b4d9f363897d48919f5a6b5ef8"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.206485 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69f6cbfd5c-82mhn"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.206622 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerID="c391cf57110fa1339292e25b19d31146c4d1fa0f41165a579239c24b64206ecb" exitCode=1
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.206681 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c373-account-delete-cq7ht" event={"ID":"b0f06117-94bf-4e56-b5f7-e83eda8ee811","Type":"ContainerDied","Data":"c391cf57110fa1339292e25b19d31146c4d1fa0f41165a579239c24b64206ecb"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.206707 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c373-account-delete-cq7ht" event={"ID":"b0f06117-94bf-4e56-b5f7-e83eda8ee811","Type":"ContainerStarted","Data":"fa3103a36b666b6735c3c7ffea80ec5d31ab7e50a3e99858c700840f29a675b3"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.207260 4834 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell1c373-account-delete-cq7ht" secret="" err="secret \"galera-openstack-cell1-dockercfg-xbdnd\" not found"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.207297 4834 scope.go:117] "RemoveContainer" containerID="c391cf57110fa1339292e25b19d31146c4d1fa0f41165a579239c24b64206ecb"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.210383 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.210340 4834 generic.go:334] "Generic (PLEG): container finished" podID="894c1f04-42d4-43de-a34a-19200ceec426" containerID="d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7" exitCode=0
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.210570 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894c1f04-42d4-43de-a34a-19200ceec426","Type":"ContainerDied","Data":"d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.210611 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"894c1f04-42d4-43de-a34a-19200ceec426","Type":"ContainerDied","Data":"3a5f11e1a18b3ab6fb6d344716492986e86282b3ecff839b1e9a50641468d1e3"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.215181 4834 generic.go:334] "Generic (PLEG): container finished" podID="a20774f5-74f4-4f7f-9f33-b4b55585cb7d" containerID="bf01ad0d24b4fd615e20d17e37792f6e0fc245dd9fd365f332d5c5a9ba9492d0" exitCode=0
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.215456 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9716-account-delete-6zvrv" event={"ID":"a20774f5-74f4-4f7f-9f33-b4b55585cb7d","Type":"ContainerDied","Data":"bf01ad0d24b4fd615e20d17e37792f6e0fc245dd9fd365f332d5c5a9ba9492d0"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.225423 4834 scope.go:117] "RemoveContainer" containerID="048b028a05ed9d1e34b226eae2432e5d45037864ccaf2322ecdfa230f03f479c"
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.227712 4834 generic.go:334] "Generic (PLEG): container finished" podID="f70f2f55-ae76-4f8a-95a4-49933695ff6b" containerID="1e8ddb85f75ede5ff35a353d39c39d0175ca89fe8ddc6c8c24ac6790a0da17e1" exitCode=0
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.227761 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron3518-account-delete-sdqt9" event={"ID":"f70f2f55-ae76-4f8a-95a4-49933695ff6b","Type":"ContainerDied","Data":"1e8ddb85f75ede5ff35a353d39c39d0175ca89fe8ddc6c8c24ac6790a0da17e1"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.242167 4834 generic.go:334] "Generic (PLEG): container finished" podID="12663254-035f-4057-b178-2dc4d42db157" containerID="e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20" exitCode=0
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.242251 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement3088-account-delete-f6znv" event={"ID":"12663254-035f-4057-b178-2dc4d42db157","Type":"ContainerDied","Data":"e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.242280 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement3088-account-delete-f6znv" event={"ID":"12663254-035f-4057-b178-2dc4d42db157","Type":"ContainerDied","Data":"6abb970dde49492b4b7392cf661dc12baca04665226a1490ba9d6d33b0e96384"}
Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.242305 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement3088-account-delete-f6znv" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.255775 4834 generic.go:334] "Generic (PLEG): container finished" podID="2851fb85-5e8a-46af-9cac-d4df0c5eb16a" containerID="1fd094e821088e300fb522feee24e3f5c1422f0a0b776c6baafc3a9ef2e8e60d" exitCode=0 Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.255814 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif0f4-account-delete-b5tm7" event={"ID":"2851fb85-5e8a-46af-9cac-d4df0c5eb16a","Type":"ContainerDied","Data":"1fd094e821088e300fb522feee24e3f5c1422f0a0b776c6baafc3a9ef2e8e60d"} Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.255838 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif0f4-account-delete-b5tm7" event={"ID":"2851fb85-5e8a-46af-9cac-d4df0c5eb16a","Type":"ContainerStarted","Data":"789e17d413f9d778a27de5438ff6be91b6f771abbc0206cd95c2d274b95241cf"} Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.263169 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.277197 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.325056 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.325295 4834 scope.go:117] "RemoveContainer" containerID="fd7251bd930a8b2a1fa53071747b9a1c8fa4231c37dca7643bd1503481f77ec1" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.336807 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-run-httpd\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc 
kubenswrapper[4834]: I1008 22:46:52.336884 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-public-tls-certs\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.336917 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64xb\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-kube-api-access-z64xb\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.336949 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-combined-ca-bundle\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.336977 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-etc-swift\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.336996 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-log-httpd\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.337065 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-internal-tls-certs\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.337096 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-config-data\") pod \"c5aa1aef-afe2-4b70-9033-c62921f3d106\" (UID: \"c5aa1aef-afe2-4b70-9033-c62921f3d106\") " Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.338022 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.339500 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.339819 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.385356 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-kube-api-access-z64xb" (OuterVolumeSpecName: "kube-api-access-z64xb") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "kube-api-access-z64xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.388836 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.394713 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement3088-account-delete-f6znv"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.403546 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement3088-account-delete-f6znv"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.415467 4834 scope.go:117] "RemoveContainer" containerID="d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.438565 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64xb\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-kube-api-access-z64xb\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.438598 4834 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5aa1aef-afe2-4b70-9033-c62921f3d106-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.438608 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.438615 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c5aa1aef-afe2-4b70-9033-c62921f3d106-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.465615 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.473114 4834 scope.go:117] "RemoveContainer" containerID="76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.484559 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.495998 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.526378 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-config-data" (OuterVolumeSpecName: "config-data") pod "c5aa1aef-afe2-4b70-9033-c62921f3d106" (UID: "c5aa1aef-afe2-4b70-9033-c62921f3d106"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.540352 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.540391 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.540403 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.540414 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5aa1aef-afe2-4b70-9033-c62921f3d106-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.545420 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.545682 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-central-agent" containerID="cri-o://647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568" gracePeriod=30 Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.546068 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="proxy-httpd" containerID="cri-o://1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15" gracePeriod=30 Oct 08 22:46:52 crc 
kubenswrapper[4834]: I1008 22:46:52.546130 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="sg-core" containerID="cri-o://6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2" gracePeriod=30 Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.546178 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-notification-agent" containerID="cri-o://cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66" gracePeriod=30 Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.563666 4834 scope.go:117] "RemoveContainer" containerID="d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7" Oct 08 22:46:52 crc kubenswrapper[4834]: E1008 22:46:52.575981 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7\": container with ID starting with d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7 not found: ID does not exist" containerID="d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.576035 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7"} err="failed to get container status \"d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7\": rpc error: code = NotFound desc = could not find container \"d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7\": container with ID starting with d646e784de1ed98c37c28ffe3ae0b7e58f0fe6b6d695716ac9a7d9f2e16a50a7 not found: ID does not exist" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.576065 4834 scope.go:117] 
"RemoveContainer" containerID="76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa" Oct 08 22:46:52 crc kubenswrapper[4834]: E1008 22:46:52.611209 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa\": container with ID starting with 76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa not found: ID does not exist" containerID="76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.611276 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa"} err="failed to get container status \"76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa\": rpc error: code = NotFound desc = could not find container \"76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa\": container with ID starting with 76243ea0ef140a340cb63efe65ec59c7b2f1673fd849f87f10bcff1ba8e296fa not found: ID does not exist" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.611307 4834 scope.go:117] "RemoveContainer" containerID="e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.632441 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.632708 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8b73c297-7a02-46b4-88bf-30b239655df8" containerName="kube-state-metrics" containerID="cri-o://e4758aa91899dd39f5a62ab08ecb06485dc5e662e93954ee17513d9d7aaa9349" gracePeriod=30 Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.742216 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 
08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.742488 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="168e9a74-197a-4210-a553-7162c2f521af" containerName="memcached" containerID="cri-o://cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1" gracePeriod=30 Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.816480 4834 scope.go:117] "RemoveContainer" containerID="e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20" Oct 08 22:46:52 crc kubenswrapper[4834]: E1008 22:46:52.817813 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20\": container with ID starting with e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20 not found: ID does not exist" containerID="e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.817869 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20"} err="failed to get container status \"e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20\": rpc error: code = NotFound desc = could not find container \"e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20\": container with ID starting with e3c11f07202ee8c306c89d2568ade8a4c62a639624b1106d102d3361b95d5a20 not found: ID does not exist" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.835318 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8rqtg"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.889218 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7bmqf"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.890274 4834 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/cinder-api-0" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.174:8776/healthcheck\": read tcp 10.217.0.2:48222->10.217.0.174:8776: read: connection reset by peer" Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.944219 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7bmqf"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.983397 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8rqtg"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.995625 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-57cf4d469b-9sj2l"] Oct 08 22:46:52 crc kubenswrapper[4834]: I1008 22:46:52.995808 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-57cf4d469b-9sj2l" podUID="788f2464-05b4-4c9a-bd83-6c1365740166" containerName="keystone-api" containerID="cri-o://1604636fd334c22a16f8d495172a59b50e33f1ed2b35af73ba1fc55f9dd3f3c9" gracePeriod=30 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.004205 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.011887 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zrv7r"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.017347 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zrv7r"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.024113 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2c76-account-create-q2px5"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.028792 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2c76-account-create-q2px5"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.255029 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerName="galera" containerID="cri-o://3c24af656d20cb96d210268f8f068f5cf9e967d712c1e772384d7063e6db6c03" gracePeriod=30 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.275493 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:59456->10.217.0.203:8775: read: connection reset by peer" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.276076 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:59444->10.217.0.203:8775: read: connection reset by peer" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.294920 4834 generic.go:334] "Generic (PLEG): container finished" podID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerID="41e88b8f67a6a8a7ddfed83ceafbca7ec43e29b70ff238cd03e0147a41767dbc" exitCode=1 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.295547 4834 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell1c373-account-delete-cq7ht" secret="" err="secret \"galera-openstack-cell1-dockercfg-xbdnd\" not found" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.295578 4834 scope.go:117] "RemoveContainer" containerID="41e88b8f67a6a8a7ddfed83ceafbca7ec43e29b70ff238cd03e0147a41767dbc" Oct 08 22:46:53 crc kubenswrapper[4834]: E1008 22:46:53.295956 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=novacell1c373-account-delete-cq7ht_openstack(b0f06117-94bf-4e56-b5f7-e83eda8ee811)\"" pod="openstack/novacell1c373-account-delete-cq7ht" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.296308 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c373-account-delete-cq7ht" event={"ID":"b0f06117-94bf-4e56-b5f7-e83eda8ee811","Type":"ContainerDied","Data":"41e88b8f67a6a8a7ddfed83ceafbca7ec43e29b70ff238cd03e0147a41767dbc"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.296339 4834 scope.go:117] "RemoveContainer" containerID="c391cf57110fa1339292e25b19d31146c4d1fa0f41165a579239c24b64206ecb" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.329526 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerID="6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2" exitCode=2 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.329612 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerDied","Data":"6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.348267 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron3518-account-delete-sdqt9" 
event={"ID":"f70f2f55-ae76-4f8a-95a4-49933695ff6b","Type":"ContainerDied","Data":"b8929a1875e8f2c32fdf30614626e50557abd2cda29cedaa59a0c9bbee2c83cb"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.348319 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8929a1875e8f2c32fdf30614626e50557abd2cda29cedaa59a0c9bbee2c83cb" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.363042 4834 generic.go:334] "Generic (PLEG): container finished" podID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerID="c3f0ee497d77bf25a33bb1a3381c77e479b17a2eac69da1fa447bfb0e183a0e4" exitCode=0 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.363130 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37143980-a3f8-4398-a1d7-0f8189fb5366","Type":"ContainerDied","Data":"c3f0ee497d77bf25a33bb1a3381c77e479b17a2eac69da1fa447bfb0e183a0e4"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.368734 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hk2mh"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.381574 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hk2mh"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.384434 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1c373-account-delete-cq7ht"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.393513 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c373-account-create-ld9q7"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.393543 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapif0f4-account-delete-b5tm7" event={"ID":"2851fb85-5e8a-46af-9cac-d4df0c5eb16a","Type":"ContainerDied","Data":"789e17d413f9d778a27de5438ff6be91b6f771abbc0206cd95c2d274b95241cf"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.393563 4834 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789e17d413f9d778a27de5438ff6be91b6f771abbc0206cd95c2d274b95241cf" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.396734 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.404755 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.428613 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.432601 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c373-account-create-ld9q7"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.433653 4834 generic.go:334] "Generic (PLEG): container finished" podID="8b73c297-7a02-46b4-88bf-30b239655df8" containerID="e4758aa91899dd39f5a62ab08ecb06485dc5e662e93954ee17513d9d7aaa9349" exitCode=2 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.433807 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b73c297-7a02-46b4-88bf-30b239655df8","Type":"ContainerDied","Data":"e4758aa91899dd39f5a62ab08ecb06485dc5e662e93954ee17513d9d7aaa9349"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.451470 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.484314 4834 generic.go:334] "Generic (PLEG): container finished" podID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerID="fd63b549a1986038c721e698592fdda290ebfb583fc9734ad9ac998ea85e14d3" exitCode=0 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.484414 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9f9b7d4b4-cr99t" event={"ID":"a163bab0-7bd2-4272-a1f0-cd0090eed141","Type":"ContainerDied","Data":"fd63b549a1986038c721e698592fdda290ebfb583fc9734ad9ac998ea85e14d3"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.504967 4834 generic.go:334] "Generic (PLEG): container finished" podID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerID="1057d7b178b738e910bc1a7a9841be19ad4ff8504d2eb4e2c8e8ff4bf273e1f0" exitCode=0 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.505097 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-55b44744c4-z2p4d" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.505571 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55b44744c4-z2p4d" event={"ID":"6c5d01ab-b923-4829-9b10-6ad9010216eb","Type":"ContainerDied","Data":"1057d7b178b738e910bc1a7a9841be19ad4ff8504d2eb4e2c8e8ff4bf273e1f0"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.505671 4834 scope.go:117] "RemoveContainer" containerID="1057d7b178b738e910bc1a7a9841be19ad4ff8504d2eb4e2c8e8ff4bf273e1f0" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.517080 4834 generic.go:334] "Generic (PLEG): container finished" podID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerID="7af54c4f4905381853f354524b17a405dd2c9d5ab3d098a361ea339c61a15d5d" exitCode=0 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.517161 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f1c297e1-ec55-4113-a87d-7813a27c03d9","Type":"ContainerDied","Data":"7af54c4f4905381853f354524b17a405dd2c9d5ab3d098a361ea339c61a15d5d"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.522818 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance9716-account-delete-6zvrv" event={"ID":"a20774f5-74f4-4f7f-9f33-b4b55585cb7d","Type":"ContainerDied","Data":"822225feb6e18a24a8f2baaee830273a0a118824e5cfb030f7a0e4c90a000f5f"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.522881 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance9716-account-delete-6zvrv" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.525210 4834 generic.go:334] "Generic (PLEG): container finished" podID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerID="577329ac5e86fe36588abe9f509038517d2bba0f083da6b616ddb28e28603822" exitCode=0 Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.525342 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fede876-b44b-40e1-8c56-9c35d2528e37","Type":"ContainerDied","Data":"577329ac5e86fe36588abe9f509038517d2bba0f083da6b616ddb28e28603822"} Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.533902 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5d01ab-b923-4829-9b10-6ad9010216eb-logs\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534008 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-public-tls-certs\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534036 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-config-data\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534070 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-internal-tls-certs\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: 
\"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534183 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4zsb\" (UniqueName: \"kubernetes.io/projected/2851fb85-5e8a-46af-9cac-d4df0c5eb16a-kube-api-access-z4zsb\") pod \"2851fb85-5e8a-46af-9cac-d4df0c5eb16a\" (UID: \"2851fb85-5e8a-46af-9cac-d4df0c5eb16a\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534242 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npnmp\" (UniqueName: \"kubernetes.io/projected/f70f2f55-ae76-4f8a-95a4-49933695ff6b-kube-api-access-npnmp\") pod \"f70f2f55-ae76-4f8a-95a4-49933695ff6b\" (UID: \"f70f2f55-ae76-4f8a-95a4-49933695ff6b\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534267 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmrv2\" (UniqueName: \"kubernetes.io/projected/6c5d01ab-b923-4829-9b10-6ad9010216eb-kube-api-access-tmrv2\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534335 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5d01ab-b923-4829-9b10-6ad9010216eb-logs" (OuterVolumeSpecName: "logs") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534367 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-scripts\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534487 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-combined-ca-bundle\") pod \"6c5d01ab-b923-4829-9b10-6ad9010216eb\" (UID: \"6c5d01ab-b923-4829-9b10-6ad9010216eb\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.534956 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c5d01ab-b923-4829-9b10-6ad9010216eb-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.541792 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-69f6cbfd5c-82mhn" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.545916 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f70f2f55-ae76-4f8a-95a4-49933695ff6b-kube-api-access-npnmp" (OuterVolumeSpecName: "kube-api-access-npnmp") pod "f70f2f55-ae76-4f8a-95a4-49933695ff6b" (UID: "f70f2f55-ae76-4f8a-95a4-49933695ff6b"). InnerVolumeSpecName "kube-api-access-npnmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.548109 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2851fb85-5e8a-46af-9cac-d4df0c5eb16a-kube-api-access-z4zsb" (OuterVolumeSpecName: "kube-api-access-z4zsb") pod "2851fb85-5e8a-46af-9cac-d4df0c5eb16a" (UID: "2851fb85-5e8a-46af-9cac-d4df0c5eb16a"). InnerVolumeSpecName "kube-api-access-z4zsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.552716 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5d01ab-b923-4829-9b10-6ad9010216eb-kube-api-access-tmrv2" (OuterVolumeSpecName: "kube-api-access-tmrv2") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "kube-api-access-tmrv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.560311 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-scripts" (OuterVolumeSpecName: "scripts") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.582513 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.593354 4834 scope.go:117] "RemoveContainer" containerID="8572778b6d4762619545e6de6bc9dc967110b451eb1a7a8a406e0ababce1ccb3" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.594110 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-config-data" (OuterVolumeSpecName: "config-data") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.628581 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c52817-5a8e-480e-847c-eceaba519de6" path="/var/lib/kubelet/pods/01c52817-5a8e-480e-847c-eceaba519de6/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.629200 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.633723 4834 scope.go:117] "RemoveContainer" containerID="bf01ad0d24b4fd615e20d17e37792f6e0fc245dd9fd365f332d5c5a9ba9492d0" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.637806 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12663254-035f-4057-b178-2dc4d42db157" path="/var/lib/kubelet/pods/12663254-035f-4057-b178-2dc4d42db157/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640465 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-scripts\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640532 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-combined-ca-bundle\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640555 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-public-tls-certs\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640590 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-config-data\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640614 4834 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-logs\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640631 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640678 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7p6c\" (UniqueName: \"kubernetes.io/projected/a20774f5-74f4-4f7f-9f33-b4b55585cb7d-kube-api-access-n7p6c\") pod \"a20774f5-74f4-4f7f-9f33-b4b55585cb7d\" (UID: \"a20774f5-74f4-4f7f-9f33-b4b55585cb7d\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640711 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-httpd-run\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.640755 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjmgb\" (UniqueName: \"kubernetes.io/projected/f1c297e1-ec55-4113-a87d-7813a27c03d9-kube-api-access-xjmgb\") pod \"f1c297e1-ec55-4113-a87d-7813a27c03d9\" (UID: \"f1c297e1-ec55-4113-a87d-7813a27c03d9\") " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.641163 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.641193 4834 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-z4zsb\" (UniqueName: \"kubernetes.io/projected/2851fb85-5e8a-46af-9cac-d4df0c5eb16a-kube-api-access-z4zsb\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.641203 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npnmp\" (UniqueName: \"kubernetes.io/projected/f70f2f55-ae76-4f8a-95a4-49933695ff6b-kube-api-access-npnmp\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.641212 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmrv2\" (UniqueName: \"kubernetes.io/projected/6c5d01ab-b923-4829-9b10-6ad9010216eb-kube-api-access-tmrv2\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.641221 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.641229 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.643349 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399e90af-d658-4f62-8efe-3c26b5f717ef" path="/var/lib/kubelet/pods/399e90af-d658-4f62-8efe-3c26b5f717ef/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.643503 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.643923 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500dac52-9a64-4864-bb56-dddfe8b82e88" path="/var/lib/kubelet/pods/500dac52-9a64-4864-bb56-dddfe8b82e88/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.644432 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5efb48ec-b3dd-4646-b813-007d45c94ad2" path="/var/lib/kubelet/pods/5efb48ec-b3dd-4646-b813-007d45c94ad2/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.645357 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-scripts" (OuterVolumeSpecName: "scripts") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.645834 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.646092 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20774f5-74f4-4f7f-9f33-b4b55585cb7d-kube-api-access-n7p6c" (OuterVolumeSpecName: "kube-api-access-n7p6c") pod "a20774f5-74f4-4f7f-9f33-b4b55585cb7d" (UID: "a20774f5-74f4-4f7f-9f33-b4b55585cb7d"). InnerVolumeSpecName "kube-api-access-n7p6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.648428 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c297e1-ec55-4113-a87d-7813a27c03d9-kube-api-access-xjmgb" (OuterVolumeSpecName: "kube-api-access-xjmgb") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "kube-api-access-xjmgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.651775 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-logs" (OuterVolumeSpecName: "logs") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.652212 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894c1f04-42d4-43de-a34a-19200ceec426" path="/var/lib/kubelet/pods/894c1f04-42d4-43de-a34a-19200ceec426/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.653078 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f6b5b1-dc89-433a-987a-5c122cfcd241" path="/var/lib/kubelet/pods/c7f6b5b1-dc89-433a-987a-5c122cfcd241/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.654171 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87464d1-1c42-43d1-b273-e25acf2895cd" path="/var/lib/kubelet/pods/c87464d1-1c42-43d1-b273-e25acf2895cd/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.654636 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf766d3-49fe-4a20-bf0e-405ccca15c69" path="/var/lib/kubelet/pods/caf766d3-49fe-4a20-bf0e-405ccca15c69/volumes" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.654821 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.669903 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-69f6cbfd5c-82mhn"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.669939 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-69f6cbfd5c-82mhn"] Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.677372 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6c5d01ab-b923-4829-9b10-6ad9010216eb" (UID: "6c5d01ab-b923-4829-9b10-6ad9010216eb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.681556 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.725215 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-config-data" (OuterVolumeSpecName: "config-data") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.728903 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f1c297e1-ec55-4113-a87d-7813a27c03d9" (UID: "f1c297e1-ec55-4113-a87d-7813a27c03d9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742292 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742324 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742336 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742350 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742375 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742388 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7p6c\" (UniqueName: 
\"kubernetes.io/projected/a20774f5-74f4-4f7f-9f33-b4b55585cb7d-kube-api-access-n7p6c\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742401 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f1c297e1-ec55-4113-a87d-7813a27c03d9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742413 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742426 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjmgb\" (UniqueName: \"kubernetes.io/projected/f1c297e1-ec55-4113-a87d-7813a27c03d9-kube-api-access-xjmgb\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742477 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5d01ab-b923-4829-9b10-6ad9010216eb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.742492 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1c297e1-ec55-4113-a87d-7813a27c03d9-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.755356 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="168e9a74-197a-4210-a553-7162c2f521af" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.106:11211: connect: connection refused" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.763854 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 08 
22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.845358 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:53 crc kubenswrapper[4834]: I1008 22:46:53.991839 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55b44744c4-z2p4d"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.008248 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.008416 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.016759 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55b44744c4-z2p4d"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.036095 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance9716-account-delete-6zvrv"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.040199 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.050861 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-scripts\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.050916 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.050933 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fede876-b44b-40e1-8c56-9c35d2528e37-etc-machine-id\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.050968 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-internal-tls-certs\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.050989 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqlz4\" (UniqueName: \"kubernetes.io/projected/37143980-a3f8-4398-a1d7-0f8189fb5366-kube-api-access-vqlz4\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051021 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-scripts\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051038 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-internal-tls-certs\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051065 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-combined-ca-bundle\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fede876-b44b-40e1-8c56-9c35d2528e37-logs\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051132 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-public-tls-certs\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051169 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk6th\" (UniqueName: \"kubernetes.io/projected/a163bab0-7bd2-4272-a1f0-cd0090eed141-kube-api-access-fk6th\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 
22:46:54.051186 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051204 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data-custom\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051221 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-internal-tls-certs\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051238 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data-custom\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051252 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-httpd-run\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051274 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxlmm\" (UniqueName: \"kubernetes.io/projected/2fede876-b44b-40e1-8c56-9c35d2528e37-kube-api-access-fxlmm\") pod 
\"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051340 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a163bab0-7bd2-4272-a1f0-cd0090eed141-logs\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051362 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-combined-ca-bundle\") pod \"a163bab0-7bd2-4272-a1f0-cd0090eed141\" (UID: \"a163bab0-7bd2-4272-a1f0-cd0090eed141\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051380 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-public-tls-certs\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051411 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-logs\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051439 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051468 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-config-data\") pod \"37143980-a3f8-4398-a1d7-0f8189fb5366\" (UID: \"37143980-a3f8-4398-a1d7-0f8189fb5366\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.051491 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-combined-ca-bundle\") pod \"2fede876-b44b-40e1-8c56-9c35d2528e37\" (UID: \"2fede876-b44b-40e1-8c56-9c35d2528e37\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.057024 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a163bab0-7bd2-4272-a1f0-cd0090eed141-logs" (OuterVolumeSpecName: "logs") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.057230 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fede876-b44b-40e1-8c56-9c35d2528e37-logs" (OuterVolumeSpecName: "logs") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.058232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fede876-b44b-40e1-8c56-9c35d2528e37-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.059963 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-logs" (OuterVolumeSpecName: "logs") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.062044 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-scripts" (OuterVolumeSpecName: "scripts") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.065224 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.107077 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fede876-b44b-40e1-8c56-9c35d2528e37-kube-api-access-fxlmm" (OuterVolumeSpecName: "kube-api-access-fxlmm") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "kube-api-access-fxlmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.108567 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-scripts" (OuterVolumeSpecName: "scripts") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.109017 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.109044 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a163bab0-7bd2-4272-a1f0-cd0090eed141-kube-api-access-fk6th" (OuterVolumeSpecName: "kube-api-access-fk6th") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "kube-api-access-fk6th". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.109385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.110089 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37143980-a3f8-4398-a1d7-0f8189fb5366-kube-api-access-vqlz4" (OuterVolumeSpecName: "kube-api-access-vqlz4") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "kube-api-access-vqlz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.110976 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.111490 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance9716-account-delete-6zvrv"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.137659 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4 is running failed: container process not found" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.137763 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.138299 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4 is running failed: container process not found" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.138627 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4 is running failed: container process not found" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.138658 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="81641859-a43e-4d35-bc09-f541277c77da" containerName="nova-cell0-conductor-conductor" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152817 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152844 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fede876-b44b-40e1-8c56-9c35d2528e37-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152854 
4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk6th\" (UniqueName: \"kubernetes.io/projected/a163bab0-7bd2-4272-a1f0-cd0090eed141-kube-api-access-fk6th\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152863 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152872 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152881 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152890 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxlmm\" (UniqueName: \"kubernetes.io/projected/2fede876-b44b-40e1-8c56-9c35d2528e37-kube-api-access-fxlmm\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152898 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a163bab0-7bd2-4272-a1f0-cd0090eed141-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152906 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37143980-a3f8-4398-a1d7-0f8189fb5366-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152913 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152921 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152940 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152948 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fede876-b44b-40e1-8c56-9c35d2528e37-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.152957 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqlz4\" (UniqueName: \"kubernetes.io/projected/37143980-a3f8-4398-a1d7-0f8189fb5366-kube-api-access-vqlz4\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.187263 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.201534 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.202285 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.202643 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.202666 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.203714 4834 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.216475 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.219637 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.219740 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.225309 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.231259 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.231409 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.239468 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.245172 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.245347 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253092 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-combined-ca-bundle\") pod \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253129 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data\") pod \"6122ff69-d6fb-4002-8679-80b826faf58f\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253159 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6122ff69-d6fb-4002-8679-80b826faf58f-logs\") pod \"6122ff69-d6fb-4002-8679-80b826faf58f\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253243 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrjtr\" (UniqueName: \"kubernetes.io/projected/81641859-a43e-4d35-bc09-f541277c77da-kube-api-access-qrjtr\") pod \"81641859-a43e-4d35-bc09-f541277c77da\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253259 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-combined-ca-bundle\") pod \"6122ff69-d6fb-4002-8679-80b826faf58f\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253291 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkzjq\" 
(UniqueName: \"kubernetes.io/projected/6122ff69-d6fb-4002-8679-80b826faf58f-kube-api-access-mkzjq\") pod \"6122ff69-d6fb-4002-8679-80b826faf58f\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253345 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-nova-metadata-tls-certs\") pod \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-certs\") pod \"8b73c297-7a02-46b4-88bf-30b239655df8\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253432 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-combined-ca-bundle\") pod \"8b73c297-7a02-46b4-88bf-30b239655df8\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253480 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f052dbd-010a-456f-af57-0b6b2f6e70ad-logs\") pod \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253496 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-config-data\") pod \"81641859-a43e-4d35-bc09-f541277c77da\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " Oct 08 22:46:54 
crc kubenswrapper[4834]: I1008 22:46:54.253516 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjghd\" (UniqueName: \"kubernetes.io/projected/8b73c297-7a02-46b4-88bf-30b239655df8-kube-api-access-pjghd\") pod \"8b73c297-7a02-46b4-88bf-30b239655df8\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253553 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-config\") pod \"8b73c297-7a02-46b4-88bf-30b239655df8\" (UID: \"8b73c297-7a02-46b4-88bf-30b239655df8\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253568 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-combined-ca-bundle\") pod \"81641859-a43e-4d35-bc09-f541277c77da\" (UID: \"81641859-a43e-4d35-bc09-f541277c77da\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253582 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-config-data\") pod \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253599 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data-custom\") pod \"6122ff69-d6fb-4002-8679-80b826faf58f\" (UID: \"6122ff69-d6fb-4002-8679-80b826faf58f\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253645 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-258sm\" (UniqueName: 
\"kubernetes.io/projected/2f052dbd-010a-456f-af57-0b6b2f6e70ad-kube-api-access-258sm\") pod \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\" (UID: \"2f052dbd-010a-456f-af57-0b6b2f6e70ad\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253899 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253911 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.253934 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.257385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6122ff69-d6fb-4002-8679-80b826faf58f-logs" (OuterVolumeSpecName: "logs") pod "6122ff69-d6fb-4002-8679-80b826faf58f" (UID: "6122ff69-d6fb-4002-8679-80b826faf58f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.260273 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f052dbd-010a-456f-af57-0b6b2f6e70ad-logs" (OuterVolumeSpecName: "logs") pod "2f052dbd-010a-456f-af57-0b6b2f6e70ad" (UID: "2f052dbd-010a-456f-af57-0b6b2f6e70ad"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.294236 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b73c297-7a02-46b4-88bf-30b239655df8-kube-api-access-pjghd" (OuterVolumeSpecName: "kube-api-access-pjghd") pod "8b73c297-7a02-46b4-88bf-30b239655df8" (UID: "8b73c297-7a02-46b4-88bf-30b239655df8"). InnerVolumeSpecName "kube-api-access-pjghd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.295019 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f052dbd-010a-456f-af57-0b6b2f6e70ad-kube-api-access-258sm" (OuterVolumeSpecName: "kube-api-access-258sm") pod "2f052dbd-010a-456f-af57-0b6b2f6e70ad" (UID: "2f052dbd-010a-456f-af57-0b6b2f6e70ad"). InnerVolumeSpecName "kube-api-access-258sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.301757 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81641859-a43e-4d35-bc09-f541277c77da-kube-api-access-qrjtr" (OuterVolumeSpecName: "kube-api-access-qrjtr") pod "81641859-a43e-4d35-bc09-f541277c77da" (UID: "81641859-a43e-4d35-bc09-f541277c77da"). InnerVolumeSpecName "kube-api-access-qrjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.301881 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6122ff69-d6fb-4002-8679-80b826faf58f-kube-api-access-mkzjq" (OuterVolumeSpecName: "kube-api-access-mkzjq") pod "6122ff69-d6fb-4002-8679-80b826faf58f" (UID: "6122ff69-d6fb-4002-8679-80b826faf58f"). InnerVolumeSpecName "kube-api-access-mkzjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.303681 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6122ff69-d6fb-4002-8679-80b826faf58f" (UID: "6122ff69-d6fb-4002-8679-80b826faf58f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.321356 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.332715 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data" (OuterVolumeSpecName: "config-data") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.334529 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.354262 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355824 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f052dbd-010a-456f-af57-0b6b2f6e70ad-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355867 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjghd\" (UniqueName: \"kubernetes.io/projected/8b73c297-7a02-46b4-88bf-30b239655df8-kube-api-access-pjghd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355882 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355893 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355903 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-258sm\" (UniqueName: \"kubernetes.io/projected/2f052dbd-010a-456f-af57-0b6b2f6e70ad-kube-api-access-258sm\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355911 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6122ff69-d6fb-4002-8679-80b826faf58f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355921 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrjtr\" (UniqueName: \"kubernetes.io/projected/81641859-a43e-4d35-bc09-f541277c77da-kube-api-access-qrjtr\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 
22:46:54.355930 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355938 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.355946 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkzjq\" (UniqueName: \"kubernetes.io/projected/6122ff69-d6fb-4002-8679-80b826faf58f-kube-api-access-mkzjq\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.374817 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.385526 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data" (OuterVolumeSpecName: "config-data") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.402562 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-config-data" (OuterVolumeSpecName: "config-data") pod "81641859-a43e-4d35-bc09-f541277c77da" (UID: "81641859-a43e-4d35-bc09-f541277c77da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.405604 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f052dbd-010a-456f-af57-0b6b2f6e70ad" (UID: "2f052dbd-010a-456f-af57-0b6b2f6e70ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.407235 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2fede876-b44b-40e1-8c56-9c35d2528e37" (UID: "2fede876-b44b-40e1-8c56-9c35d2528e37"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.420184 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-config-data" (OuterVolumeSpecName: "config-data") pod "37143980-a3f8-4398-a1d7-0f8189fb5366" (UID: "37143980-a3f8-4398-a1d7-0f8189fb5366"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.423701 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b73c297-7a02-46b4-88bf-30b239655df8" (UID: "8b73c297-7a02-46b4-88bf-30b239655df8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.432264 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "8b73c297-7a02-46b4-88bf-30b239655df8" (UID: "8b73c297-7a02-46b4-88bf-30b239655df8"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.432286 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81641859-a43e-4d35-bc09-f541277c77da" (UID: "81641859-a43e-4d35-bc09-f541277c77da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.432287 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a163bab0-7bd2-4272-a1f0-cd0090eed141" (UID: "a163bab0-7bd2-4272-a1f0-cd0090eed141"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.434744 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.446926 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-config-data" (OuterVolumeSpecName: "config-data") pod "2f052dbd-010a-456f-af57-0b6b2f6e70ad" (UID: "2f052dbd-010a-456f-af57-0b6b2f6e70ad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.452670 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6122ff69-d6fb-4002-8679-80b826faf58f" (UID: "6122ff69-d6fb-4002-8679-80b826faf58f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.453215 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data" (OuterVolumeSpecName: "config-data") pod "6122ff69-d6fb-4002-8679-80b826faf58f" (UID: "6122ff69-d6fb-4002-8679-80b826faf58f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.457502 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-combined-ca-bundle\") pod \"92213f20-28bf-4fe1-b547-6867677b0049\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.457577 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-memcached-tls-certs\") pod \"168e9a74-197a-4210-a553-7162c2f521af\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.457641 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnqk2\" (UniqueName: \"kubernetes.io/projected/168e9a74-197a-4210-a553-7162c2f521af-kube-api-access-pnqk2\") pod \"168e9a74-197a-4210-a553-7162c2f521af\" (UID: 
\"168e9a74-197a-4210-a553-7162c2f521af\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.457806 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-combined-ca-bundle\") pod \"168e9a74-197a-4210-a553-7162c2f521af\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.457921 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-config-data\") pod \"92213f20-28bf-4fe1-b547-6867677b0049\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.457990 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-config-data\") pod \"168e9a74-197a-4210-a553-7162c2f521af\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.459356 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-kolla-config\") pod \"168e9a74-197a-4210-a553-7162c2f521af\" (UID: \"168e9a74-197a-4210-a553-7162c2f521af\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.459414 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlfjg\" (UniqueName: \"kubernetes.io/projected/92213f20-28bf-4fe1-b547-6867677b0049-kube-api-access-dlfjg\") pod \"92213f20-28bf-4fe1-b547-6867677b0049\" (UID: \"92213f20-28bf-4fe1-b547-6867677b0049\") " Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460102 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a163bab0-7bd2-4272-a1f0-cd0090eed141-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460123 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460138 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fede876-b44b-40e1-8c56-9c35d2528e37-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460235 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460246 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37143980-a3f8-4398-a1d7-0f8189fb5366-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460260 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460273 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460285 4834 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc 
kubenswrapper[4834]: I1008 22:46:54.460299 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81641859-a43e-4d35-bc09-f541277c77da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460314 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460329 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460342 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.460355 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6122ff69-d6fb-4002-8679-80b826faf58f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.463457 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-config-data" (OuterVolumeSpecName: "config-data") pod "168e9a74-197a-4210-a553-7162c2f521af" (UID: "168e9a74-197a-4210-a553-7162c2f521af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.464032 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "168e9a74-197a-4210-a553-7162c2f521af" (UID: "168e9a74-197a-4210-a553-7162c2f521af"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.465289 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92213f20-28bf-4fe1-b547-6867677b0049-kube-api-access-dlfjg" (OuterVolumeSpecName: "kube-api-access-dlfjg") pod "92213f20-28bf-4fe1-b547-6867677b0049" (UID: "92213f20-28bf-4fe1-b547-6867677b0049"). InnerVolumeSpecName "kube-api-access-dlfjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.475788 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168e9a74-197a-4210-a553-7162c2f521af-kube-api-access-pnqk2" (OuterVolumeSpecName: "kube-api-access-pnqk2") pod "168e9a74-197a-4210-a553-7162c2f521af" (UID: "168e9a74-197a-4210-a553-7162c2f521af"). InnerVolumeSpecName "kube-api-access-pnqk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.484528 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-config-data" (OuterVolumeSpecName: "config-data") pod "92213f20-28bf-4fe1-b547-6867677b0049" (UID: "92213f20-28bf-4fe1-b547-6867677b0049"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.491641 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "168e9a74-197a-4210-a553-7162c2f521af" (UID: "168e9a74-197a-4210-a553-7162c2f521af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.509962 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "8b73c297-7a02-46b4-88bf-30b239655df8" (UID: "8b73c297-7a02-46b4-88bf-30b239655df8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.511643 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2f052dbd-010a-456f-af57-0b6b2f6e70ad" (UID: "2f052dbd-010a-456f-af57-0b6b2f6e70ad"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.516966 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92213f20-28bf-4fe1-b547-6867677b0049" (UID: "92213f20-28bf-4fe1-b547-6867677b0049"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.554885 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fede876-b44b-40e1-8c56-9c35d2528e37","Type":"ContainerDied","Data":"10b85818c43ca0acb9f142c960fa5273d09c7f7893f29c91852fbdf61fa330c8"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.554960 4834 scope.go:117] "RemoveContainer" containerID="577329ac5e86fe36588abe9f509038517d2bba0f083da6b616ddb28e28603822" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.554910 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.557759 4834 generic.go:334] "Generic (PLEG): container finished" podID="81641859-a43e-4d35-bc09-f541277c77da" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.557847 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81641859-a43e-4d35-bc09-f541277c77da","Type":"ContainerDied","Data":"8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.557870 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"81641859-a43e-4d35-bc09-f541277c77da","Type":"ContainerDied","Data":"09fa05b1cd6cbb741f7bb1e530e34dc61992d68a3a175d0ea1ffa982f2c8cd8f"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.557918 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562256 4834 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562383 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlfjg\" (UniqueName: \"kubernetes.io/projected/92213f20-28bf-4fe1-b547-6867677b0049-kube-api-access-dlfjg\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562447 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562556 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnqk2\" (UniqueName: \"kubernetes.io/projected/168e9a74-197a-4210-a553-7162c2f521af-kube-api-access-pnqk2\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562612 4834 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f052dbd-010a-456f-af57-0b6b2f6e70ad-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562663 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562901 4834 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b73c297-7a02-46b4-88bf-30b239655df8-kube-state-metrics-tls-certs\") on node \"crc\" 
DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.562953 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92213f20-28bf-4fe1-b547-6867677b0049-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.563029 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/168e9a74-197a-4210-a553-7162c2f521af-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.564277 4834 generic.go:334] "Generic (PLEG): container finished" podID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerID="076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.564338 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f052dbd-010a-456f-af57-0b6b2f6e70ad","Type":"ContainerDied","Data":"076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.564362 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2f052dbd-010a-456f-af57-0b6b2f6e70ad","Type":"ContainerDied","Data":"7acb661a811c85fadcd6283573fe932fc01bc74c02d8f572f503b03d2d25bacb"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.564411 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.573158 4834 generic.go:334] "Generic (PLEG): container finished" podID="168e9a74-197a-4210-a553-7162c2f521af" containerID="cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.573204 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"168e9a74-197a-4210-a553-7162c2f521af","Type":"ContainerDied","Data":"cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.573223 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"168e9a74-197a-4210-a553-7162c2f521af","Type":"ContainerDied","Data":"9e459cb24d84ef24a6424c3430e6389583e448564dbe71e65be8b7d58c80e8e9"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.573265 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.575081 4834 generic.go:334] "Generic (PLEG): container finished" podID="6122ff69-d6fb-4002-8679-80b826faf58f" containerID="f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.575154 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" event={"ID":"6122ff69-d6fb-4002-8679-80b826faf58f","Type":"ContainerDied","Data":"f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.575328 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" event={"ID":"6122ff69-d6fb-4002-8679-80b826faf58f","Type":"ContainerDied","Data":"79458c5df48162ca6a34cac62d0eb56dd5e41f478eaa8a4c8e2da867f7740d30"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.575210 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-688bb4b854-srcv6" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.592007 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"37143980-a3f8-4398-a1d7-0f8189fb5366","Type":"ContainerDied","Data":"6ef5a229e7d07f00f199279a38c613785b06ec9ce380dac470d56e2748fe1027"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.592046 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.609139 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.610190 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "168e9a74-197a-4210-a553-7162c2f521af" (UID: "168e9a74-197a-4210-a553-7162c2f521af"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.610435 4834 scope.go:117] "RemoveContainer" containerID="69bc22ed35e6a76565637096b43955534881f7b3f617bfa8208087f9e3cad9e3" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.612826 4834 generic.go:334] "Generic (PLEG): container finished" podID="92213f20-28bf-4fe1-b547-6867677b0049" containerID="16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.612888 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"92213f20-28bf-4fe1-b547-6867677b0049","Type":"ContainerDied","Data":"16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.612911 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"92213f20-28bf-4fe1-b547-6867677b0049","Type":"ContainerDied","Data":"4831f1b98054aff999105ef4a4c7451e953cf981dee2c6797895c5d9555bc81d"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.612968 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.617186 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b73c297-7a02-46b4-88bf-30b239655df8","Type":"ContainerDied","Data":"aefb7fc7aa9622954b805fb5d9c58e984502b890dd2f99f69843b5ea26b5e1ff"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.617282 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.619406 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.628861 4834 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell1c373-account-delete-cq7ht" secret="" err="secret \"galera-openstack-cell1-dockercfg-xbdnd\" not found" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.628923 4834 scope.go:117] "RemoveContainer" containerID="41e88b8f67a6a8a7ddfed83ceafbca7ec43e29b70ff238cd03e0147a41767dbc" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.629284 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=novacell1c373-account-delete-cq7ht_openstack(b0f06117-94bf-4e56-b5f7-e83eda8ee811)\"" pod="openstack/novacell1c373-account-delete-cq7ht" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.633105 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00e05134-e159-40fe-9c63-a0dc406c8dee/ovn-northd/0.log" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.633199 4834 generic.go:334] "Generic (PLEG): container finished" podID="00e05134-e159-40fe-9c63-a0dc406c8dee" 
containerID="befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7" exitCode=139 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.633349 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00e05134-e159-40fe-9c63-a0dc406c8dee","Type":"ContainerDied","Data":"befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.638609 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.648842 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.654693 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerID="1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.654785 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerID="647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568" exitCode=0 Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.654797 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerDied","Data":"1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.654855 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerDied","Data":"647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.663883 4834 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/168e9a74-197a-4210-a553-7162c2f521af-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.666253 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.667836 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9f9b7d4b4-cr99t" event={"ID":"a163bab0-7bd2-4272-a1f0-cd0090eed141","Type":"ContainerDied","Data":"b90057f3d2386a8e0be598e7cb620f1af13ad5d1bd77db7b16ee6b1aecff30d7"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.667849 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9f9b7d4b4-cr99t" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.671113 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron3518-account-delete-sdqt9" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.671522 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.671520 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f1c297e1-ec55-4113-a87d-7813a27c03d9","Type":"ContainerDied","Data":"0e92c2f238ed646df6d52d13fbf2e27b633698504120bf398e47a90cb5276806"} Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.671740 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapif0f4-account-delete-b5tm7" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.678824 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.691830 4834 scope.go:117] "RemoveContainer" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.699354 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-688bb4b854-srcv6"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.710269 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-688bb4b854-srcv6"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.718576 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.727600 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.735233 4834 scope.go:117] "RemoveContainer" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.735731 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4\": container with ID starting with 8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4 not found: ID does not exist" containerID="8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.735773 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4"} err="failed to get container status 
\"8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4\": rpc error: code = NotFound desc = could not find container \"8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4\": container with ID starting with 8793967b6bca4f0f436df66010537a92f04ab9a25d7b54596c5af15753bc35e4 not found: ID does not exist" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.735797 4834 scope.go:117] "RemoveContainer" containerID="076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.743682 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.750751 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.758400 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.768183 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.773404 4834 scope.go:117] "RemoveContainer" containerID="e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.778489 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron3518-account-delete-sdqt9"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.786038 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron3518-account-delete-sdqt9"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.791940 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9f9b7d4b4-cr99t"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.797946 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9f9b7d4b4-cr99t"] Oct 
08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.803972 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapif0f4-account-delete-b5tm7"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.809272 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapif0f4-account-delete-b5tm7"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.822465 4834 scope.go:117] "RemoveContainer" containerID="076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.823617 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde\": container with ID starting with 076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde not found: ID does not exist" containerID="076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.823647 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde"} err="failed to get container status \"076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde\": rpc error: code = NotFound desc = could not find container \"076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde\": container with ID starting with 076875605f67f5f773c3929bec2e085e03b401b24922943ccf94d778e73bbdde not found: ID does not exist" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.823678 4834 scope.go:117] "RemoveContainer" containerID="e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.825687 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f\": container with ID starting with e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f not found: ID does not exist" containerID="e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.825721 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f"} err="failed to get container status \"e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f\": rpc error: code = NotFound desc = could not find container \"e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f\": container with ID starting with e5fbf3dfab52fddd6705504ae85baf057b103dea50affd654e156b76c58f623f not found: ID does not exist" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.825736 4834 scope.go:117] "RemoveContainer" containerID="cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.829395 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.834918 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.853901 4834 scope.go:117] "RemoveContainer" containerID="cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1" Oct 08 22:46:54 crc kubenswrapper[4834]: E1008 22:46:54.854614 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1\": container with ID starting with cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1 not found: ID does not exist" 
containerID="cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.854667 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1"} err="failed to get container status \"cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1\": rpc error: code = NotFound desc = could not find container \"cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1\": container with ID starting with cc2cc2df524ac30c517a6da54a2a306187af9e9dd24a45e3d7df0f7dc25a73b1 not found: ID does not exist" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.854695 4834 scope.go:117] "RemoveContainer" containerID="f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902" Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.974563 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.978375 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 08 22:46:54 crc kubenswrapper[4834]: I1008 22:46:54.979747 4834 scope.go:117] "RemoveContainer" containerID="e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.000934 4834 scope.go:117] "RemoveContainer" containerID="f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902" Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.001389 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902\": container with ID starting with f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902 not found: ID does not exist" containerID="f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902" Oct 08 22:46:55 crc 
kubenswrapper[4834]: I1008 22:46:55.001426 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902"} err="failed to get container status \"f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902\": rpc error: code = NotFound desc = could not find container \"f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902\": container with ID starting with f4112eae344e33c2ed1e518a1ec200f2e5a71736e028860090c20eae2cea0902 not found: ID does not exist" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.001452 4834 scope.go:117] "RemoveContainer" containerID="e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a" Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.001770 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a\": container with ID starting with e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a not found: ID does not exist" containerID="e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.001816 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a"} err="failed to get container status \"e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a\": rpc error: code = NotFound desc = could not find container \"e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a\": container with ID starting with e0129efbb3686c4ac85c467d7e795d114a6de5a26a31044df0216a19b3300c6a not found: ID does not exist" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.001835 4834 scope.go:117] "RemoveContainer" containerID="c3f0ee497d77bf25a33bb1a3381c77e479b17a2eac69da1fa447bfb0e183a0e4" Oct 08 
22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.024377 4834 scope.go:117] "RemoveContainer" containerID="97de4609bf212569f354a8db48765bc289647b7d03ecd1100224cb7a89ad47c3" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.072380 4834 scope.go:117] "RemoveContainer" containerID="16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.095431 4834 scope.go:117] "RemoveContainer" containerID="16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469" Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.095805 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469\": container with ID starting with 16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469 not found: ID does not exist" containerID="16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.095838 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469"} err="failed to get container status \"16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469\": rpc error: code = NotFound desc = could not find container \"16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469\": container with ID starting with 16bcb11a1ca18e052f73bc017145f17afd2ff7f2d3c4ac4c7024f6ffa30d6469 not found: ID does not exist" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.095858 4834 scope.go:117] "RemoveContainer" containerID="e4758aa91899dd39f5a62ab08ecb06485dc5e662e93954ee17513d9d7aaa9349" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.116165 4834 scope.go:117] "RemoveContainer" containerID="fd63b549a1986038c721e698592fdda290ebfb583fc9734ad9ac998ea85e14d3" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 
22:46:55.137061 4834 scope.go:117] "RemoveContainer" containerID="ee03e22162c15519f33753c495b20abc4d67e8ca443ee3f90594ba5404a7a3a2" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.160848 4834 scope.go:117] "RemoveContainer" containerID="7af54c4f4905381853f354524b17a405dd2c9d5ab3d098a361ea339c61a15d5d" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.186672 4834 scope.go:117] "RemoveContainer" containerID="354d0197ed5738529f4ce14ae4d167d9d0a781f57eca46b2a301712e02875868" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.245424 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00e05134-e159-40fe-9c63-a0dc406c8dee/ovn-northd/0.log" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.245486 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.252410 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.254465 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.255673 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.255732 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerName="nova-scheduler-scheduler" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.376615 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkm97\" (UniqueName: \"kubernetes.io/projected/00e05134-e159-40fe-9c63-a0dc406c8dee-kube-api-access-pkm97\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.376704 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-rundir\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.376801 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-config\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.376842 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-metrics-certs-tls-certs\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.376932 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-northd-tls-certs\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.377005 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-scripts\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.377027 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-combined-ca-bundle\") pod \"00e05134-e159-40fe-9c63-a0dc406c8dee\" (UID: \"00e05134-e159-40fe-9c63-a0dc406c8dee\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.377783 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-config" (OuterVolumeSpecName: "config") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: "00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.379948 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: "00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.379983 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-scripts" (OuterVolumeSpecName: "scripts") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: "00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.383730 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e05134-e159-40fe-9c63-a0dc406c8dee-kube-api-access-pkm97" (OuterVolumeSpecName: "kube-api-access-pkm97") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: "00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "kube-api-access-pkm97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.433641 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: "00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.453371 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: "00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.484517 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.484556 4834 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.484571 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e05134-e159-40fe-9c63-a0dc406c8dee-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.484582 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.484594 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkm97\" (UniqueName: \"kubernetes.io/projected/00e05134-e159-40fe-9c63-a0dc406c8dee-kube-api-access-pkm97\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.484606 4834 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.495173 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "00e05134-e159-40fe-9c63-a0dc406c8dee" (UID: 
"00e05134-e159-40fe-9c63-a0dc406c8dee"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.570134 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168e9a74-197a-4210-a553-7162c2f521af" path="/var/lib/kubelet/pods/168e9a74-197a-4210-a553-7162c2f521af/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.570617 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2851fb85-5e8a-46af-9cac-d4df0c5eb16a" path="/var/lib/kubelet/pods/2851fb85-5e8a-46af-9cac-d4df0c5eb16a/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.571123 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" path="/var/lib/kubelet/pods/2f052dbd-010a-456f-af57-0b6b2f6e70ad/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.572209 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" path="/var/lib/kubelet/pods/2fede876-b44b-40e1-8c56-9c35d2528e37/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.573034 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" path="/var/lib/kubelet/pods/37143980-a3f8-4398-a1d7-0f8189fb5366/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.573655 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" path="/var/lib/kubelet/pods/6122ff69-d6fb-4002-8679-80b826faf58f/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.574797 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" path="/var/lib/kubelet/pods/6c5d01ab-b923-4829-9b10-6ad9010216eb/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.575375 4834 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="81641859-a43e-4d35-bc09-f541277c77da" path="/var/lib/kubelet/pods/81641859-a43e-4d35-bc09-f541277c77da/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.575906 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b73c297-7a02-46b4-88bf-30b239655df8" path="/var/lib/kubelet/pods/8b73c297-7a02-46b4-88bf-30b239655df8/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.577280 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92213f20-28bf-4fe1-b547-6867677b0049" path="/var/lib/kubelet/pods/92213f20-28bf-4fe1-b547-6867677b0049/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.577797 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" path="/var/lib/kubelet/pods/a163bab0-7bd2-4272-a1f0-cd0090eed141/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.578298 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20774f5-74f4-4f7f-9f33-b4b55585cb7d" path="/var/lib/kubelet/pods/a20774f5-74f4-4f7f-9f33-b4b55585cb7d/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.579385 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" path="/var/lib/kubelet/pods/c5aa1aef-afe2-4b70-9033-c62921f3d106/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.579949 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" path="/var/lib/kubelet/pods/f1c297e1-ec55-4113-a87d-7813a27c03d9/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.580676 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f70f2f55-ae76-4f8a-95a4-49933695ff6b" path="/var/lib/kubelet/pods/f70f2f55-ae76-4f8a-95a4-49933695ff6b/volumes" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.586443 4834 reconciler_common.go:293] "Volume 
detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e05134-e159-40fe-9c63-a0dc406c8dee-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.673676 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.687441 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-internal-tls-certs\") pod \"e4629ae3-d685-43c9-81fd-49e84abd427f\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.687504 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqhqk\" (UniqueName: \"kubernetes.io/projected/e4629ae3-d685-43c9-81fd-49e84abd427f-kube-api-access-fqhqk\") pod \"e4629ae3-d685-43c9-81fd-49e84abd427f\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.687528 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-combined-ca-bundle\") pod \"e4629ae3-d685-43c9-81fd-49e84abd427f\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.687567 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4629ae3-d685-43c9-81fd-49e84abd427f-logs\") pod \"e4629ae3-d685-43c9-81fd-49e84abd427f\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.687600 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-public-tls-certs\") pod \"e4629ae3-d685-43c9-81fd-49e84abd427f\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.687642 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-config-data\") pod \"e4629ae3-d685-43c9-81fd-49e84abd427f\" (UID: \"e4629ae3-d685-43c9-81fd-49e84abd427f\") " Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.687993 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.688049 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data podName:08a7721f-38a1-4a82-88ed-6f70290b5a6d nodeName:}" failed. No retries permitted until 2025-10-08 22:47:03.688033596 +0000 UTC m=+1431.510918342 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data") pod "rabbitmq-server-0" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d") : configmap "rabbitmq-config-data" not found Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.689601 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4629ae3-d685-43c9-81fd-49e84abd427f-logs" (OuterVolumeSpecName: "logs") pod "e4629ae3-d685-43c9-81fd-49e84abd427f" (UID: "e4629ae3-d685-43c9-81fd-49e84abd427f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.692660 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4629ae3-d685-43c9-81fd-49e84abd427f-kube-api-access-fqhqk" (OuterVolumeSpecName: "kube-api-access-fqhqk") pod "e4629ae3-d685-43c9-81fd-49e84abd427f" (UID: "e4629ae3-d685-43c9-81fd-49e84abd427f"). InnerVolumeSpecName "kube-api-access-fqhqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.696042 4834 generic.go:334] "Generic (PLEG): container finished" podID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerID="91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb" exitCode=0 Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.696120 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4629ae3-d685-43c9-81fd-49e84abd427f","Type":"ContainerDied","Data":"91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb"} Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.696160 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e4629ae3-d685-43c9-81fd-49e84abd427f","Type":"ContainerDied","Data":"f3fc1ed14f7e07d05b56b95383b8614931412f15377cf0d2af9d197fc464ce56"} Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.696177 4834 scope.go:117] "RemoveContainer" containerID="91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.696273 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.714921 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_00e05134-e159-40fe-9c63-a0dc406c8dee/ovn-northd/0.log" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.714979 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"00e05134-e159-40fe-9c63-a0dc406c8dee","Type":"ContainerDied","Data":"ebc156307ae1e0f44c84bb59a1cd9048a78bdbb163f2d1a956d0ff4a051a33c8"} Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.715056 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.744272 4834 generic.go:334] "Generic (PLEG): container finished" podID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerID="3c24af656d20cb96d210268f8f068f5cf9e967d712c1e772384d7063e6db6c03" exitCode=0 Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.744614 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34aacb58-3b8d-466d-9b71-e7098b95fe8e","Type":"ContainerDied","Data":"3c24af656d20cb96d210268f8f068f5cf9e967d712c1e772384d7063e6db6c03"} Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.749682 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4629ae3-d685-43c9-81fd-49e84abd427f" (UID: "e4629ae3-d685-43c9-81fd-49e84abd427f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.753848 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.770619 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.770920 4834 scope.go:117] "RemoveContainer" containerID="1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.779444 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4629ae3-d685-43c9-81fd-49e84abd427f" (UID: "e4629ae3-d685-43c9-81fd-49e84abd427f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.781341 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-config-data" (OuterVolumeSpecName: "config-data") pod "e4629ae3-d685-43c9-81fd-49e84abd427f" (UID: "e4629ae3-d685-43c9-81fd-49e84abd427f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.781429 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4629ae3-d685-43c9-81fd-49e84abd427f" (UID: "e4629ae3-d685-43c9-81fd-49e84abd427f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.790839 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.790874 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqhqk\" (UniqueName: \"kubernetes.io/projected/e4629ae3-d685-43c9-81fd-49e84abd427f-kube-api-access-fqhqk\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.790888 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.790899 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4629ae3-d685-43c9-81fd-49e84abd427f-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.790911 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.790921 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4629ae3-d685-43c9-81fd-49e84abd427f-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.852736 4834 scope.go:117] "RemoveContainer" containerID="91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb" Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.853328 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb\": container with ID starting with 91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb not found: ID does not exist" containerID="91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.853362 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb"} err="failed to get container status \"91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb\": rpc error: code = NotFound desc = could not find container \"91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb\": container with ID starting with 91c252fcdb0a95093ccf0b43533d67a7d99adc2858eb55d2f60c85cf2cf2cdfb not found: ID does not exist" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.853385 4834 scope.go:117] "RemoveContainer" containerID="1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4" Oct 08 22:46:55 crc kubenswrapper[4834]: E1008 22:46:55.853738 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4\": container with ID starting with 1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4 not found: ID does not exist" containerID="1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.853775 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4"} err="failed to get container status \"1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4\": rpc error: code = NotFound desc = could not find container 
\"1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4\": container with ID starting with 1aa647034ef8640a17c6a15eef2e518f048d1fa10c9a9aee36b78bdc01cd48b4 not found: ID does not exist" Oct 08 22:46:55 crc kubenswrapper[4834]: I1008 22:46:55.853799 4834 scope.go:117] "RemoveContainer" containerID="ee6702ece47fd3dad3f711249016d49520a142737dfe65f64635bcd1579089db" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.010417 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.014693 4834 scope.go:117] "RemoveContainer" containerID="befa7386c77d07b3be61cbc85442566df26dcee9bc664cf8da1c08dd1f7c92d7" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.064460 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.064513 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096545 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-secrets\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096589 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-combined-ca-bundle\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096610 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: 
\"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096636 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjvsh\" (UniqueName: \"kubernetes.io/projected/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kube-api-access-bjvsh\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096658 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-operator-scripts\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096687 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-generated\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096705 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-default\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096732 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kolla-config\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.096751 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-galera-tls-certs\") pod \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\" (UID: \"34aacb58-3b8d-466d-9b71-e7098b95fe8e\") " Oct 08 22:46:56 crc kubenswrapper[4834]: E1008 22:46:56.097043 4834 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:56 crc kubenswrapper[4834]: E1008 22:46:56.097089 4834 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data podName:9809d14f-10d2-479f-94d9-5b3ae7f49e7b nodeName:}" failed. No retries permitted until 2025-10-08 22:47:04.097076055 +0000 UTC m=+1431.919960801 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data") pod "rabbitmq-cell1-server-0" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b") : configmap "rabbitmq-cell1-config-data" not found Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.097942 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.099371 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.099385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.101620 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.103055 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-secrets" (OuterVolumeSpecName: "secrets") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.106294 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kube-api-access-bjvsh" (OuterVolumeSpecName: "kube-api-access-bjvsh") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "kube-api-access-bjvsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.109125 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.117460 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.136340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.158232 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "34aacb58-3b8d-466d-9b71-e7098b95fe8e" (UID: "34aacb58-3b8d-466d-9b71-e7098b95fe8e"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198341 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dkcl\" (UniqueName: \"kubernetes.io/projected/b0f06117-94bf-4e56-b5f7-e83eda8ee811-kube-api-access-6dkcl\") pod \"b0f06117-94bf-4e56-b5f7-e83eda8ee811\" (UID: \"b0f06117-94bf-4e56-b5f7-e83eda8ee811\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198635 4834 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-secrets\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198651 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198676 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198686 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjvsh\" (UniqueName: \"kubernetes.io/projected/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kube-api-access-bjvsh\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198695 4834 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198705 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198714 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198722 4834 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/34aacb58-3b8d-466d-9b71-e7098b95fe8e-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.198730 4834 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/34aacb58-3b8d-466d-9b71-e7098b95fe8e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.202204 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f06117-94bf-4e56-b5f7-e83eda8ee811-kube-api-access-6dkcl" (OuterVolumeSpecName: "kube-api-access-6dkcl") pod "b0f06117-94bf-4e56-b5f7-e83eda8ee811" (UID: "b0f06117-94bf-4e56-b5f7-e83eda8ee811"). InnerVolumeSpecName "kube-api-access-6dkcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.220319 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.224526 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.299653 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.299784 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-confd\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.299857 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-server-conf\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.299969 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-tls\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300000 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvbvj\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-kube-api-access-fvbvj\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300076 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300108 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08a7721f-38a1-4a82-88ed-6f70290b5a6d-erlang-cookie-secret\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300254 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-plugins-conf\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300319 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-plugins\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300376 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-erlang-cookie\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300407 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08a7721f-38a1-4a82-88ed-6f70290b5a6d-pod-info\") pod \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\" (UID: \"08a7721f-38a1-4a82-88ed-6f70290b5a6d\") " Oct 08 22:46:56 crc kubenswrapper[4834]: 
I1008 22:46:56.300704 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dkcl\" (UniqueName: \"kubernetes.io/projected/b0f06117-94bf-4e56-b5f7-e83eda8ee811-kube-api-access-6dkcl\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.300719 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.301314 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.301537 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.301691 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.306401 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a7721f-38a1-4a82-88ed-6f70290b5a6d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.306422 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/08a7721f-38a1-4a82-88ed-6f70290b5a6d-pod-info" (OuterVolumeSpecName: "pod-info") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.306410 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.309256 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.309374 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-kube-api-access-fvbvj" (OuterVolumeSpecName: "kube-api-access-fvbvj") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "kube-api-access-fvbvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.333846 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data" (OuterVolumeSpecName: "config-data") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.357654 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-server-conf" (OuterVolumeSpecName: "server-conf") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.380739 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "08a7721f-38a1-4a82-88ed-6f70290b5a6d" (UID: "08a7721f-38a1-4a82-88ed-6f70290b5a6d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402830 4834 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/08a7721f-38a1-4a82-88ed-6f70290b5a6d-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402878 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402891 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402904 4834 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402915 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402927 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvbvj\" (UniqueName: \"kubernetes.io/projected/08a7721f-38a1-4a82-88ed-6f70290b5a6d-kube-api-access-fvbvj\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402939 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402951 4834 reconciler_common.go:293] 
"Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/08a7721f-38a1-4a82-88ed-6f70290b5a6d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402961 4834 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/08a7721f-38a1-4a82-88ed-6f70290b5a6d-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402972 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.402984 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/08a7721f-38a1-4a82-88ed-6f70290b5a6d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.426762 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.505264 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.657351 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.769093 4834 generic.go:334] "Generic (PLEG): container finished" podID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerID="5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa" exitCode=0 Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.769276 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.769262 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9809d14f-10d2-479f-94d9-5b3ae7f49e7b","Type":"ContainerDied","Data":"5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa"} Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.770291 4834 scope.go:117] "RemoveContainer" containerID="5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.784087 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.770116 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9809d14f-10d2-479f-94d9-5b3ae7f49e7b","Type":"ContainerDied","Data":"f3a013dc36f7527e1a3f86374b0da4447d80bc2da2ad93f227efdfbfd8b071d3"} Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.784234 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"34aacb58-3b8d-466d-9b71-e7098b95fe8e","Type":"ContainerDied","Data":"ee16b9bf45efb2157cb54df419fb5da30fb8110333fdcccea76fdbba81208ad3"} Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.792686 4834 generic.go:334] "Generic (PLEG): container finished" podID="788f2464-05b4-4c9a-bd83-6c1365740166" containerID="1604636fd334c22a16f8d495172a59b50e33f1ed2b35af73ba1fc55f9dd3f3c9" exitCode=0 Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.792735 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-57cf4d469b-9sj2l" event={"ID":"788f2464-05b4-4c9a-bd83-6c1365740166","Type":"ContainerDied","Data":"1604636fd334c22a16f8d495172a59b50e33f1ed2b35af73ba1fc55f9dd3f3c9"} Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.799227 4834 scope.go:117] "RemoveContainer" containerID="f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.807372 4834 generic.go:334] "Generic (PLEG): container finished" podID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerID="d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b" exitCode=0 Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.807412 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"08a7721f-38a1-4a82-88ed-6f70290b5a6d","Type":"ContainerDied","Data":"d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b"} Oct 08 22:46:56 crc 
kubenswrapper[4834]: I1008 22:46:56.807430 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"08a7721f-38a1-4a82-88ed-6f70290b5a6d","Type":"ContainerDied","Data":"2ef2b6b302292227b34f6c27277b8625c533333fc59fa4b246d88e8be45f12ad"} Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.807480 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816295 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh2nl\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-kube-api-access-fh2nl\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816347 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-confd\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816368 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816421 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-erlang-cookie-secret\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816435 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-tls\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816460 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-plugins-conf\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816480 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-plugins\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816498 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816535 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-erlang-cookie\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816591 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-pod-info\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 
08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.816617 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.818360 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.822745 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-kube-api-access-fh2nl" (OuterVolumeSpecName: "kube-api-access-fh2nl") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "kube-api-access-fh2nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.823106 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.823121 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.824376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c373-account-delete-cq7ht" event={"ID":"b0f06117-94bf-4e56-b5f7-e83eda8ee811","Type":"ContainerDied","Data":"fa3103a36b666b6735c3c7ffea80ec5d31ab7e50a3e99858c700840f29a675b3"} Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.824425 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1c373-account-delete-cq7ht" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.826865 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.832219 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.847626 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-pod-info" (OuterVolumeSpecName: "pod-info") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.847673 4834 scope.go:117] "RemoveContainer" containerID="5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.861066 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.861741 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: E1008 22:46:56.864130 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa\": container with ID starting with 5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa not found: ID does not exist" containerID="5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.864201 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa"} err="failed to get container status \"5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa\": rpc error: code = NotFound desc = could not find container \"5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa\": container with ID starting with 5d9514045c935fa894162fd45b39ec094f493d6cb728ac60cb7260b0bb6a23fa not found: ID does not exist" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.864229 4834 scope.go:117] "RemoveContainer" containerID="f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.867378 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 08 22:46:56 crc kubenswrapper[4834]: E1008 22:46:56.869939 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee\": container with ID starting with f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee not found: ID does not exist" containerID="f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.869993 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee"} err="failed to get container status \"f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee\": rpc error: code = NotFound desc = could not find container \"f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee\": container with ID starting with f498e75a1c76bc1070028f242d044a99535c4ab8ce469c8317c198d7520564ee not found: ID does not exist" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.870021 4834 scope.go:117] "RemoveContainer" containerID="3c24af656d20cb96d210268f8f068f5cf9e967d712c1e772384d7063e6db6c03" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.876493 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.882562 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.918110 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf" (OuterVolumeSpecName: "server-conf") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.918558 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf\") pod \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\" (UID: \"9809d14f-10d2-479f-94d9-5b3ae7f49e7b\") " Oct 08 22:46:56 crc kubenswrapper[4834]: W1008 22:46:56.918956 4834 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9809d14f-10d2-479f-94d9-5b3ae7f49e7b/volumes/kubernetes.io~configmap/server-conf Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.919028 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf" (OuterVolumeSpecName: "server-conf") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.920844 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.920920 4834 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.920981 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.921031 4834 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.921081 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.921324 4834 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.921531 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.921598 4834 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.921653 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh2nl\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-kube-api-access-fh2nl\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.941753 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data" (OuterVolumeSpecName: "config-data") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.945315 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 08 22:46:56 crc kubenswrapper[4834]: I1008 22:46:56.982592 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9809d14f-10d2-479f-94d9-5b3ae7f49e7b" (UID: "9809d14f-10d2-479f-94d9-5b3ae7f49e7b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.022909 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.022939 4834 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9809d14f-10d2-479f-94d9-5b3ae7f49e7b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.022949 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.233203 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.239409 4834 scope.go:117] "RemoveContainer" containerID="e01a7e96d33013eec69b7d341daec852f82c75bfd87353e3cb99da288feb48fd" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.245720 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1c373-account-delete-cq7ht"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.250477 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.253336 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1c373-account-delete-cq7ht"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.302923 4834 scope.go:117] "RemoveContainer" containerID="d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.307642 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.313582 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.331285 4834 scope.go:117] "RemoveContainer" containerID="3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.355198 4834 scope.go:117] "RemoveContainer" containerID="d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b" Oct 08 22:46:57 crc kubenswrapper[4834]: E1008 22:46:57.361874 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b\": container with ID starting with d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b not found: ID does not exist" containerID="d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.361912 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b"} err="failed to get container status \"d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b\": rpc error: code = NotFound desc = could not find container 
\"d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b\": container with ID starting with d45fc994c0ded46eacd3c34ef9d7bd611a887f7680ed3782c64c1af97de14c9b not found: ID does not exist" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.361937 4834 scope.go:117] "RemoveContainer" containerID="3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721" Oct 08 22:46:57 crc kubenswrapper[4834]: E1008 22:46:57.363320 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721\": container with ID starting with 3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721 not found: ID does not exist" containerID="3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.363341 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721"} err="failed to get container status \"3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721\": rpc error: code = NotFound desc = could not find container \"3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721\": container with ID starting with 3aaaa7b4a3cb0c93a5b20371c77783e5b966e0007e153d3de4823bb1f71e3721 not found: ID does not exist" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.363356 4834 scope.go:117] "RemoveContainer" containerID="41e88b8f67a6a8a7ddfed83ceafbca7ec43e29b70ff238cd03e0147a41767dbc" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.393687 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429600 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-sg-core-conf-yaml\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429642 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7d4f35-145c-4af9-9f4b-de8700877370-logs\") pod \"2f7d4f35-145c-4af9-9f4b-de8700877370\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429663 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-combined-ca-bundle\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429681 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-log-httpd\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429717 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-config-data\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429747 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2f6g\" (UniqueName: 
\"kubernetes.io/projected/2f7d4f35-145c-4af9-9f4b-de8700877370-kube-api-access-p2f6g\") pod \"2f7d4f35-145c-4af9-9f4b-de8700877370\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.429764 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-internal-tls-certs\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430317 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430371 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data-custom\") pod \"2f7d4f35-145c-4af9-9f4b-de8700877370\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430405 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-ceilometer-tls-certs\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430421 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-combined-ca-bundle\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" 
(UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430459 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-fernet-keys\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430519 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-combined-ca-bundle\") pod \"2f7d4f35-145c-4af9-9f4b-de8700877370\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.430922 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7d4f35-145c-4af9-9f4b-de8700877370-logs" (OuterVolumeSpecName: "logs") pod "2f7d4f35-145c-4af9-9f4b-de8700877370" (UID: "2f7d4f35-145c-4af9-9f4b-de8700877370"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431122 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-scripts\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431168 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-public-tls-certs\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431193 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-credential-keys\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431216 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data\") pod \"2f7d4f35-145c-4af9-9f4b-de8700877370\" (UID: \"2f7d4f35-145c-4af9-9f4b-de8700877370\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431238 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-scripts\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431274 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-run-httpd\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431293 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-config-data\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431331 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfwmv\" (UniqueName: \"kubernetes.io/projected/ed2e3be8-465e-4b20-9586-387cd8d9ca67-kube-api-access-cfwmv\") pod \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\" (UID: \"ed2e3be8-465e-4b20-9586-387cd8d9ca67\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431354 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68b9s\" (UniqueName: \"kubernetes.io/projected/788f2464-05b4-4c9a-bd83-6c1365740166-kube-api-access-68b9s\") pod \"788f2464-05b4-4c9a-bd83-6c1365740166\" (UID: \"788f2464-05b4-4c9a-bd83-6c1365740166\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.431689 4834 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.434238 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.436221 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-scripts" (OuterVolumeSpecName: "scripts") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.439362 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/788f2464-05b4-4c9a-bd83-6c1365740166-kube-api-access-68b9s" (OuterVolumeSpecName: "kube-api-access-68b9s") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "kube-api-access-68b9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.446365 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7d4f35-145c-4af9-9f4b-de8700877370-kube-api-access-p2f6g" (OuterVolumeSpecName: "kube-api-access-p2f6g") pod "2f7d4f35-145c-4af9-9f4b-de8700877370" (UID: "2f7d4f35-145c-4af9-9f4b-de8700877370"). InnerVolumeSpecName "kube-api-access-p2f6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.451460 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.460177 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.466664 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2e3be8-465e-4b20-9586-387cd8d9ca67-kube-api-access-cfwmv" (OuterVolumeSpecName: "kube-api-access-cfwmv") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "kube-api-access-cfwmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.471031 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2f7d4f35-145c-4af9-9f4b-de8700877370" (UID: "2f7d4f35-145c-4af9-9f4b-de8700877370"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.476647 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.478071 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-config-data" (OuterVolumeSpecName: "config-data") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.479792 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f7d4f35-145c-4af9-9f4b-de8700877370" (UID: "2f7d4f35-145c-4af9-9f4b-de8700877370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.481340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-scripts" (OuterVolumeSpecName: "scripts") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.483007 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.491053 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.507712 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.508600 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data" (OuterVolumeSpecName: "config-data") pod "2f7d4f35-145c-4af9-9f4b-de8700877370" (UID: "2f7d4f35-145c-4af9-9f4b-de8700877370"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.510406 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.521910 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "788f2464-05b4-4c9a-bd83-6c1365740166" (UID: "788f2464-05b4-4c9a-bd83-6c1365740166"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.529313 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532187 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-combined-ca-bundle\") pod \"f120f0d7-ba00-4502-a2f3-7c619440887a\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532250 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-config-data\") pod \"f120f0d7-ba00-4502-a2f3-7c619440887a\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532360 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n2r2\" (UniqueName: \"kubernetes.io/projected/f120f0d7-ba00-4502-a2f3-7c619440887a-kube-api-access-4n2r2\") pod \"f120f0d7-ba00-4502-a2f3-7c619440887a\" (UID: \"f120f0d7-ba00-4502-a2f3-7c619440887a\") " Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532651 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532666 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2f6g\" (UniqueName: 
\"kubernetes.io/projected/2f7d4f35-145c-4af9-9f4b-de8700877370-kube-api-access-p2f6g\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532676 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532684 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532695 4834 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532703 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532711 4834 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532718 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532726 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-scripts\") on node \"crc\" DevicePath \"\"" 
Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532734 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532742 4834 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532750 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f7d4f35-145c-4af9-9f4b-de8700877370-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532758 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532767 4834 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed2e3be8-465e-4b20-9586-387cd8d9ca67-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532776 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68b9s\" (UniqueName: \"kubernetes.io/projected/788f2464-05b4-4c9a-bd83-6c1365740166-kube-api-access-68b9s\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532784 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfwmv\" (UniqueName: \"kubernetes.io/projected/ed2e3be8-465e-4b20-9586-387cd8d9ca67-kube-api-access-cfwmv\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532792 4834 reconciler_common.go:293] "Volume detached for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532800 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/788f2464-05b4-4c9a-bd83-6c1365740166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.532808 4834 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f7d4f35-145c-4af9-9f4b-de8700877370-logs\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.535385 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f120f0d7-ba00-4502-a2f3-7c619440887a-kube-api-access-4n2r2" (OuterVolumeSpecName: "kube-api-access-4n2r2") pod "f120f0d7-ba00-4502-a2f3-7c619440887a" (UID: "f120f0d7-ba00-4502-a2f3-7c619440887a"). InnerVolumeSpecName "kube-api-access-4n2r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.551655 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-config-data" (OuterVolumeSpecName: "config-data") pod "f120f0d7-ba00-4502-a2f3-7c619440887a" (UID: "f120f0d7-ba00-4502-a2f3-7c619440887a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.557158 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f120f0d7-ba00-4502-a2f3-7c619440887a" (UID: "f120f0d7-ba00-4502-a2f3-7c619440887a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.567023 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" path="/var/lib/kubelet/pods/00e05134-e159-40fe-9c63-a0dc406c8dee/volumes" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.568168 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" path="/var/lib/kubelet/pods/08a7721f-38a1-4a82-88ed-6f70290b5a6d/volumes" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.569924 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" path="/var/lib/kubelet/pods/34aacb58-3b8d-466d-9b71-e7098b95fe8e/volumes" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.570286 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-config-data" (OuterVolumeSpecName: "config-data") pod "ed2e3be8-465e-4b20-9586-387cd8d9ca67" (UID: "ed2e3be8-465e-4b20-9586-387cd8d9ca67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.570932 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" path="/var/lib/kubelet/pods/9809d14f-10d2-479f-94d9-5b3ae7f49e7b/volumes" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.571612 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" path="/var/lib/kubelet/pods/b0f06117-94bf-4e56-b5f7-e83eda8ee811/volumes" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.572863 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" path="/var/lib/kubelet/pods/e4629ae3-d685-43c9-81fd-49e84abd427f/volumes" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.635870 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed2e3be8-465e-4b20-9586-387cd8d9ca67-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.635910 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.635921 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f120f0d7-ba00-4502-a2f3-7c619440887a-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.635931 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n2r2\" (UniqueName: \"kubernetes.io/projected/f120f0d7-ba00-4502-a2f3-7c619440887a-kube-api-access-4n2r2\") on node \"crc\" DevicePath \"\"" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.840403 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-57cf4d469b-9sj2l" event={"ID":"788f2464-05b4-4c9a-bd83-6c1365740166","Type":"ContainerDied","Data":"d84cd735daabc7206b8d9676421b437b88d922ac103ce0d626c8b224e8af4f31"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.840811 4834 scope.go:117] "RemoveContainer" containerID="1604636fd334c22a16f8d495172a59b50e33f1ed2b35af73ba1fc55f9dd3f3c9" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.840452 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-57cf4d469b-9sj2l" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.842807 4834 generic.go:334] "Generic (PLEG): container finished" podID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" exitCode=0 Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.843173 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f120f0d7-ba00-4502-a2f3-7c619440887a","Type":"ContainerDied","Data":"49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.845105 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f120f0d7-ba00-4502-a2f3-7c619440887a","Type":"ContainerDied","Data":"7689a28d2a430320b843a86f66429ec6661809e8039c4f8e35600b4b4812a94e"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.845954 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.856691 4834 generic.go:334] "Generic (PLEG): container finished" podID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerID="91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30" exitCode=0 Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.856729 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7779b9cfc5-lq477" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.856772 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7779b9cfc5-lq477" event={"ID":"2f7d4f35-145c-4af9-9f4b-de8700877370","Type":"ContainerDied","Data":"91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.856802 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7779b9cfc5-lq477" event={"ID":"2f7d4f35-145c-4af9-9f4b-de8700877370","Type":"ContainerDied","Data":"99e7181eaa3c23e0cb2c438664d40eff23644d038c17f82ce48912c5aa714f94"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.864708 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerID="cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66" exitCode=0 Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.864745 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerDied","Data":"cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.864767 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed2e3be8-465e-4b20-9586-387cd8d9ca67","Type":"ContainerDied","Data":"b367283040987c5997b885caf9fbb3a88c0c67915cda01342f2bf40fc24e3a58"} Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.864816 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.889888 4834 scope.go:117] "RemoveContainer" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.890011 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-57cf4d469b-9sj2l"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.893243 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-57cf4d469b-9sj2l"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.898910 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7779b9cfc5-lq477"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.903317 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7779b9cfc5-lq477"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.910936 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.914996 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.939824 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.945766 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.990485 4834 scope.go:117] "RemoveContainer" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" Oct 08 22:46:57 crc kubenswrapper[4834]: E1008 22:46:57.991023 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189\": container with ID starting with 
49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189 not found: ID does not exist" containerID="49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.991085 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189"} err="failed to get container status \"49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189\": rpc error: code = NotFound desc = could not find container \"49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189\": container with ID starting with 49d4bf42135d2bfa7b426036e80f71b698c04d351d2593a2e2f19d34a6588189 not found: ID does not exist" Oct 08 22:46:57 crc kubenswrapper[4834]: I1008 22:46:57.991125 4834 scope.go:117] "RemoveContainer" containerID="91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.010798 4834 scope.go:117] "RemoveContainer" containerID="49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.029511 4834 scope.go:117] "RemoveContainer" containerID="91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30" Oct 08 22:46:58 crc kubenswrapper[4834]: E1008 22:46:58.029869 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30\": container with ID starting with 91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30 not found: ID does not exist" containerID="91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.029928 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30"} 
err="failed to get container status \"91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30\": rpc error: code = NotFound desc = could not find container \"91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30\": container with ID starting with 91bc866a40d8cd6ae97cd0196df92bd74f8bc6d69ab347d97b4b45e357486e30 not found: ID does not exist" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.029963 4834 scope.go:117] "RemoveContainer" containerID="49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387" Oct 08 22:46:58 crc kubenswrapper[4834]: E1008 22:46:58.030270 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387\": container with ID starting with 49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387 not found: ID does not exist" containerID="49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.030301 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387"} err="failed to get container status \"49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387\": rpc error: code = NotFound desc = could not find container \"49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387\": container with ID starting with 49a4837887871793d5bd61522d4486481e468e9ccbe44afb693d4cf994046387 not found: ID does not exist" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.030325 4834 scope.go:117] "RemoveContainer" containerID="1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.050650 4834 scope.go:117] "RemoveContainer" containerID="6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2" Oct 08 22:46:58 crc kubenswrapper[4834]: 
I1008 22:46:58.069666 4834 scope.go:117] "RemoveContainer" containerID="cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.098897 4834 scope.go:117] "RemoveContainer" containerID="647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.133181 4834 scope.go:117] "RemoveContainer" containerID="1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15" Oct 08 22:46:58 crc kubenswrapper[4834]: E1008 22:46:58.133535 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15\": container with ID starting with 1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15 not found: ID does not exist" containerID="1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.133562 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15"} err="failed to get container status \"1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15\": rpc error: code = NotFound desc = could not find container \"1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15\": container with ID starting with 1032f4edc7088e996cbec692eba1d575d4cb67e5112593147d72e8f6a8aadd15 not found: ID does not exist" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.133581 4834 scope.go:117] "RemoveContainer" containerID="6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2" Oct 08 22:46:58 crc kubenswrapper[4834]: E1008 22:46:58.133854 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2\": 
container with ID starting with 6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2 not found: ID does not exist" containerID="6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.133880 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2"} err="failed to get container status \"6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2\": rpc error: code = NotFound desc = could not find container \"6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2\": container with ID starting with 6c68965f390b060b5d3c620a960a9119d4e278229de82c1adb3ab74e578a87f2 not found: ID does not exist" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.133898 4834 scope.go:117] "RemoveContainer" containerID="cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66" Oct 08 22:46:58 crc kubenswrapper[4834]: E1008 22:46:58.134075 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66\": container with ID starting with cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66 not found: ID does not exist" containerID="cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.134093 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66"} err="failed to get container status \"cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66\": rpc error: code = NotFound desc = could not find container \"cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66\": container with ID starting with 
cf0ae15b1941b4dd9f887d569bab53002471c7132e035bbb1f9a2f478ae20d66 not found: ID does not exist" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.134108 4834 scope.go:117] "RemoveContainer" containerID="647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568" Oct 08 22:46:58 crc kubenswrapper[4834]: E1008 22:46:58.134304 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568\": container with ID starting with 647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568 not found: ID does not exist" containerID="647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.134323 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568"} err="failed to get container status \"647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568\": rpc error: code = NotFound desc = could not find container \"647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568\": container with ID starting with 647473fbbcb274c0e2ff66facfea42e5a01db54b90650c2eb3272770f97d4568 not found: ID does not exist" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.912210 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9f9b7d4b4-cr99t" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 22:46:58 crc kubenswrapper[4834]: I1008 22:46:58.912243 4834 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9f9b7d4b4-cr99t" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api" 
probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.192883 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.193705 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.194527 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.194583 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" 
podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.194803 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.197309 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.199466 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:46:59 crc kubenswrapper[4834]: E1008 22:46:59.199551 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:46:59 crc kubenswrapper[4834]: I1008 22:46:59.571054 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" path="/var/lib/kubelet/pods/2f7d4f35-145c-4af9-9f4b-de8700877370/volumes" Oct 08 22:46:59 crc kubenswrapper[4834]: I1008 22:46:59.572482 4834 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="788f2464-05b4-4c9a-bd83-6c1365740166" path="/var/lib/kubelet/pods/788f2464-05b4-4c9a-bd83-6c1365740166/volumes" Oct 08 22:46:59 crc kubenswrapper[4834]: I1008 22:46:59.574265 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" path="/var/lib/kubelet/pods/ed2e3be8-465e-4b20-9586-387cd8d9ca67/volumes" Oct 08 22:46:59 crc kubenswrapper[4834]: I1008 22:46:59.576551 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" path="/var/lib/kubelet/pods/f120f0d7-ba00-4502-a2f3-7c619440887a/volumes" Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.192055 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.192921 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.193667 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" 
containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.193760 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.194023 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.195791 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.197180 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:04 crc kubenswrapper[4834]: E1008 22:47:04.197265 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:04 crc kubenswrapper[4834]: I1008 22:47:04.938644 4834 generic.go:334] "Generic (PLEG): container finished" podID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerID="0994fee875d4da6ff390c9c0551593b552bdf75310faf54c410141d37e0f066c" exitCode=0 Oct 08 22:47:04 crc kubenswrapper[4834]: I1008 22:47:04.938868 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6cc747c5-vzjm2" event={"ID":"62795e13-2e9c-4656-ab88-8788e50d37c5","Type":"ContainerDied","Data":"0994fee875d4da6ff390c9c0551593b552bdf75310faf54c410141d37e0f066c"} Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.215753 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.373752 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-config\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.373914 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-ovndb-tls-certs\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.374066 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-public-tls-certs\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc 
kubenswrapper[4834]: I1008 22:47:05.374103 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-httpd-config\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.374185 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-internal-tls-certs\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.374246 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-combined-ca-bundle\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.374299 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqbkx\" (UniqueName: \"kubernetes.io/projected/62795e13-2e9c-4656-ab88-8788e50d37c5-kube-api-access-wqbkx\") pod \"62795e13-2e9c-4656-ab88-8788e50d37c5\" (UID: \"62795e13-2e9c-4656-ab88-8788e50d37c5\") " Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.381404 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62795e13-2e9c-4656-ab88-8788e50d37c5-kube-api-access-wqbkx" (OuterVolumeSpecName: "kube-api-access-wqbkx") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "kube-api-access-wqbkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.385981 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.427398 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.441645 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.444670 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-config" (OuterVolumeSpecName: "config") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.448970 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.473245 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "62795e13-2e9c-4656-ab88-8788e50d37c5" (UID: "62795e13-2e9c-4656-ab88-8788e50d37c5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476228 4834 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476262 4834 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476274 4834 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476286 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476297 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqbkx\" (UniqueName: \"kubernetes.io/projected/62795e13-2e9c-4656-ab88-8788e50d37c5-kube-api-access-wqbkx\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476312 4834 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-config\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.476323 4834 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/62795e13-2e9c-4656-ab88-8788e50d37c5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.952618 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6cc747c5-vzjm2" event={"ID":"62795e13-2e9c-4656-ab88-8788e50d37c5","Type":"ContainerDied","Data":"51b9abfcc3e72b36455f7cd5a244162d8f78ed0aca8fdc256e54b5fb3b57ab7e"} Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.952668 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f6cc747c5-vzjm2" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.952680 4834 scope.go:117] "RemoveContainer" containerID="4c640bd73e780e0d64b84db93f10b09fd43cf57588febe8f44765e1d6c226f04" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.980065 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f6cc747c5-vzjm2"] Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.988816 4834 scope.go:117] "RemoveContainer" containerID="0994fee875d4da6ff390c9c0551593b552bdf75310faf54c410141d37e0f066c" Oct 08 22:47:05 crc kubenswrapper[4834]: I1008 22:47:05.989565 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f6cc747c5-vzjm2"] Oct 08 22:47:07 crc kubenswrapper[4834]: I1008 22:47:07.568202 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" path="/var/lib/kubelet/pods/62795e13-2e9c-4656-ab88-8788e50d37c5/volumes" Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.193331 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.194417 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 
22:47:09.194998 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.195057 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.195261 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.197483 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.199406 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:09 crc kubenswrapper[4834]: E1008 22:47:09.199480 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.192075 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.192859 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.193523 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 
22:47:14.193600 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.194009 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.195940 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.198487 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:14 crc kubenswrapper[4834]: E1008 22:47:14.198571 4834 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:18 
crc kubenswrapper[4834]: I1008 22:47:18.262265 4834 scope.go:117] "RemoveContainer" containerID="64fd98bb69dac7ab8ba350b5c61193e765d42f583c0863b0f4e186a2f503569a" Oct 08 22:47:18 crc kubenswrapper[4834]: I1008 22:47:18.292617 4834 scope.go:117] "RemoveContainer" containerID="a188889565e6b65f10154bc8da82638cedd8d784828f5cb0e1798acec24eeead" Oct 08 22:47:18 crc kubenswrapper[4834]: I1008 22:47:18.343940 4834 scope.go:117] "RemoveContainer" containerID="ca24648ca7cb1d935733fd8d6a5dae57fa7a4565f69f1abcdbe7a629bc83a777" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.095720 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c","Type":"ContainerDied","Data":"0caa48090b97f4cd0f143f8b3522146daa146a232e6629762e868998fa0cbab2"} Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.095625 4834 generic.go:334] "Generic (PLEG): container finished" podID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerID="0caa48090b97f4cd0f143f8b3522146daa146a232e6629762e868998fa0cbab2" exitCode=137 Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.126704 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerID="3fd81fba6fef2c5e63023d49c353e82227e744ef65163a821e731e717fb6624a" exitCode=137 Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.126782 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"3fd81fba6fef2c5e63023d49c353e82227e744ef65163a821e731e717fb6624a"} Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.130252 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jmkzz_77e43c87-585d-4d7c-bd16-ab66b531e024/ovs-vswitchd/0.log" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.131065 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" exitCode=137 Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.131116 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerDied","Data":"dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799"} Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.131202 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jmkzz" event={"ID":"77e43c87-585d-4d7c-bd16-ab66b531e024","Type":"ContainerDied","Data":"76706c2eee92568380cfe1f5f9dc6e6ad86ac08ebce8672626360217a3030472"} Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.131268 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76706c2eee92568380cfe1f5f9dc6e6ad86ac08ebce8672626360217a3030472" Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.195060 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799 is running failed: container process not found" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.195442 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.195954 4834 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799 is running failed: container process not found" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.196006 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.196946 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799 is running failed: container process not found" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.196979 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.197070 4834 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 08 22:47:19 crc kubenswrapper[4834]: E1008 22:47:19.197131 4834 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jmkzz" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.255059 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jmkzz_77e43c87-585d-4d7c-bd16-ab66b531e024/ovs-vswitchd/0.log" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.255961 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.385976 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e43c87-585d-4d7c-bd16-ab66b531e024-scripts\") pod \"77e43c87-585d-4d7c-bd16-ab66b531e024\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386018 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-run\") pod \"77e43c87-585d-4d7c-bd16-ab66b531e024\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386039 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqg8b\" (UniqueName: \"kubernetes.io/projected/77e43c87-585d-4d7c-bd16-ab66b531e024-kube-api-access-vqg8b\") pod \"77e43c87-585d-4d7c-bd16-ab66b531e024\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386056 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-lib\") pod \"77e43c87-585d-4d7c-bd16-ab66b531e024\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386077 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-etc-ovs\") pod \"77e43c87-585d-4d7c-bd16-ab66b531e024\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386124 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-log\") pod \"77e43c87-585d-4d7c-bd16-ab66b531e024\" (UID: \"77e43c87-585d-4d7c-bd16-ab66b531e024\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386440 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-run" (OuterVolumeSpecName: "var-run") pod "77e43c87-585d-4d7c-bd16-ab66b531e024" (UID: "77e43c87-585d-4d7c-bd16-ab66b531e024"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386498 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-lib" (OuterVolumeSpecName: "var-lib") pod "77e43c87-585d-4d7c-bd16-ab66b531e024" (UID: "77e43c87-585d-4d7c-bd16-ab66b531e024"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386532 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "77e43c87-585d-4d7c-bd16-ab66b531e024" (UID: "77e43c87-585d-4d7c-bd16-ab66b531e024"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.386647 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-log" (OuterVolumeSpecName: "var-log") pod "77e43c87-585d-4d7c-bd16-ab66b531e024" (UID: "77e43c87-585d-4d7c-bd16-ab66b531e024"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.387689 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77e43c87-585d-4d7c-bd16-ab66b531e024-scripts" (OuterVolumeSpecName: "scripts") pod "77e43c87-585d-4d7c-bd16-ab66b531e024" (UID: "77e43c87-585d-4d7c-bd16-ab66b531e024"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.387869 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/77e43c87-585d-4d7c-bd16-ab66b531e024-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.387887 4834 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-run\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.387899 4834 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-lib\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.387909 4834 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.387922 4834 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/77e43c87-585d-4d7c-bd16-ab66b531e024-var-log\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.397772 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e43c87-585d-4d7c-bd16-ab66b531e024-kube-api-access-vqg8b" (OuterVolumeSpecName: 
"kube-api-access-vqg8b") pod "77e43c87-585d-4d7c-bd16-ab66b531e024" (UID: "77e43c87-585d-4d7c-bd16-ab66b531e024"). InnerVolumeSpecName "kube-api-access-vqg8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.450468 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.462208 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.489468 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqg8b\" (UniqueName: \"kubernetes.io/projected/77e43c87-585d-4d7c-bd16-ab66b531e024-kube-api-access-vqg8b\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591225 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-etc-machine-id\") pod \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591288 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-lock\") pod \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591324 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" (UID: "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591327 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-combined-ca-bundle\") pod \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591390 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591409 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data\") pod \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591455 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbhrb\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-kube-api-access-dbhrb\") pod \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l82nh\" (UniqueName: \"kubernetes.io/projected/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-kube-api-access-l82nh\") pod \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591513 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-scripts\") pod \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591570 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") pod \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591595 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data-custom\") pod \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\" (UID: \"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591626 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-cache\") pod \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\" (UID: \"6ab95611-95ff-46bf-9b06-2ed44a58fa46\") " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.591871 4834 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.592182 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-lock" (OuterVolumeSpecName: "lock") pod "6ab95611-95ff-46bf-9b06-2ed44a58fa46" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.592259 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-cache" (OuterVolumeSpecName: "cache") pod "6ab95611-95ff-46bf-9b06-2ed44a58fa46" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.596053 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-kube-api-access-dbhrb" (OuterVolumeSpecName: "kube-api-access-dbhrb") pod "6ab95611-95ff-46bf-9b06-2ed44a58fa46" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46"). InnerVolumeSpecName "kube-api-access-dbhrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.596478 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "6ab95611-95ff-46bf-9b06-2ed44a58fa46" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.597788 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-scripts" (OuterVolumeSpecName: "scripts") pod "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" (UID: "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.598413 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" (UID: "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.598764 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6ab95611-95ff-46bf-9b06-2ed44a58fa46" (UID: "6ab95611-95ff-46bf-9b06-2ed44a58fa46"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.599451 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-kube-api-access-l82nh" (OuterVolumeSpecName: "kube-api-access-l82nh") pod "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" (UID: "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c"). InnerVolumeSpecName "kube-api-access-l82nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.628317 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" (UID: "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.670209 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data" (OuterVolumeSpecName: "config-data") pod "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" (UID: "9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693650 4834 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693719 4834 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693734 4834 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-cache\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693746 4834 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ab95611-95ff-46bf-9b06-2ed44a58fa46-lock\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693757 4834 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693814 4834 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693828 4834 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-config-data\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693840 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbhrb\" (UniqueName: \"kubernetes.io/projected/6ab95611-95ff-46bf-9b06-2ed44a58fa46-kube-api-access-dbhrb\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693891 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l82nh\" (UniqueName: \"kubernetes.io/projected/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-kube-api-access-l82nh\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.693913 4834 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c-scripts\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.712910 4834 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 08 22:47:19 crc kubenswrapper[4834]: I1008 22:47:19.795565 4834 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.146740 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.146728 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c","Type":"ContainerDied","Data":"55a61ae21161f808e2370ed0ef72b15c1c6f81d7ea2a20b590d081b02eb38a04"} Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.146999 4834 scope.go:117] "RemoveContainer" containerID="16284da8c6f35291d6b17fb6dfd3cb470e7f9c4ec4f746f4bc0659d2fb85b5fb" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.160061 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ab95611-95ff-46bf-9b06-2ed44a58fa46","Type":"ContainerDied","Data":"4a6704ede3b4e4fe0891a2208f043b040a53ddfdb0c9c606a38792df9ed39863"} Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.160076 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jmkzz" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.160207 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.204295 4834 scope.go:117] "RemoveContainer" containerID="0caa48090b97f4cd0f143f8b3522146daa146a232e6629762e868998fa0cbab2" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.211910 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-jmkzz"] Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.221986 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-jmkzz"] Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.226850 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.234463 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.241801 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.247737 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.263165 4834 scope.go:117] "RemoveContainer" containerID="3fd81fba6fef2c5e63023d49c353e82227e744ef65163a821e731e717fb6624a" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.293755 4834 scope.go:117] "RemoveContainer" containerID="fd0fe5fab5d6566b676108078b9bb8956f0d4000e18eeb98d65adfb995ed66db" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.321881 4834 scope.go:117] "RemoveContainer" containerID="cc2d791ec077a375cc5534fb4333d22ce2bfa09e913fe297560734c663737cc0" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.348700 4834 scope.go:117] "RemoveContainer" containerID="d032b4a1902d533ed5cdb4bbf1a9f2c37094a94a35f3ce8f02ac5f212367ceee" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.372836 4834 scope.go:117] "RemoveContainer" 
containerID="f098a2e0cb0624cb3fe14c1c933baa20366c8520017da6de05a7436114e9e875" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.412278 4834 scope.go:117] "RemoveContainer" containerID="3f5e1b6ecf4bdb7095f6e4f54237dd8d5e0fce913b21ceb7e2e9d3bbe4da4702" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.454763 4834 scope.go:117] "RemoveContainer" containerID="916771be477f120fd96b5a1f5443d68a2ee7c86b4f15262feb8c7da880418d12" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.475129 4834 scope.go:117] "RemoveContainer" containerID="f652c6fc5992ba73c5161286567da0fcc0cd2540cfc9653f9d3e00b5ca106caa" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.505754 4834 scope.go:117] "RemoveContainer" containerID="6ce010fd5f1a7322ee19e0d85fe1859b18c483f1e6cee129d2324a72aca8c9ae" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.535884 4834 scope.go:117] "RemoveContainer" containerID="2091dbf498187923ad07111022a998400d80f39064fb32a84e0b373a124b3d4d" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.565173 4834 scope.go:117] "RemoveContainer" containerID="d8010142af61d9a4bcf191642a46ef203b560507fdc751d54f09b14df4705c8f" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.592840 4834 scope.go:117] "RemoveContainer" containerID="32976af02fd539929cf231bf96ee9923fbea70a134e86a660abf29e813f31e4c" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.618084 4834 scope.go:117] "RemoveContainer" containerID="5523f6432e877d33c1dd624b005e3a10c13f5c90d768a9ec5b33eb492f9cd80b" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.647212 4834 scope.go:117] "RemoveContainer" containerID="7514b6a8989d60bf273471f9730c93e60d545194b6408566793dc09f41f4ed2c" Oct 08 22:47:20 crc kubenswrapper[4834]: I1008 22:47:20.679754 4834 scope.go:117] "RemoveContainer" containerID="06fd06ac9cbc7b6f70094375f8d6b9bf90833b141ddadf2cad95017b587d05a9" Oct 08 22:47:21 crc kubenswrapper[4834]: I1008 22:47:21.572997 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" path="/var/lib/kubelet/pods/6ab95611-95ff-46bf-9b06-2ed44a58fa46/volumes" Oct 08 22:47:21 crc kubenswrapper[4834]: I1008 22:47:21.578407 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" path="/var/lib/kubelet/pods/77e43c87-585d-4d7c-bd16-ab66b531e024/volumes" Oct 08 22:47:21 crc kubenswrapper[4834]: I1008 22:47:21.579826 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" path="/var/lib/kubelet/pods/9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c/volumes" Oct 08 22:47:21 crc kubenswrapper[4834]: I1008 22:47:21.587315 4834 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb8b7117c-11cc-4ba9-bd98-e25e6a56d8a6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb8b7117c-11cc-4ba9-bd98-e25e6a56d8a6] : Timed out while waiting for systemd to remove kubepods-besteffort-podb8b7117c_11cc_4ba9_bd98_e25e6a56d8a6.slice" Oct 08 22:47:21 crc kubenswrapper[4834]: E1008 22:47:21.587403 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podb8b7117c-11cc-4ba9-bd98-e25e6a56d8a6] : unable to destroy cgroup paths for cgroup [kubepods besteffort podb8b7117c-11cc-4ba9-bd98-e25e6a56d8a6] : Timed out while waiting for systemd to remove kubepods-besteffort-podb8b7117c_11cc_4ba9_bd98_e25e6a56d8a6.slice" pod="openstack/openstackclient" podUID="b8b7117c-11cc-4ba9-bd98-e25e6a56d8a6" Oct 08 22:47:22 crc kubenswrapper[4834]: I1008 22:47:22.186002 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.610575 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rpvcv"] Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611520 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92213f20-28bf-4fe1-b547-6867677b0049" containerName="nova-cell1-conductor-conductor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611538 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92213f20-28bf-4fe1-b547-6867677b0049" containerName="nova-cell1-conductor-conductor" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611557 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12663254-035f-4057-b178-2dc4d42db157" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611565 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="12663254-035f-4057-b178-2dc4d42db157" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611574 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b73c297-7a02-46b4-88bf-30b239655df8" containerName="kube-state-metrics" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611583 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b73c297-7a02-46b4-88bf-30b239655df8" containerName="kube-state-metrics" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611600 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611607 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-server" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611621 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="setup-container" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611628 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="setup-container" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611638 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-metadata" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611646 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-metadata" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611662 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611669 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611680 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="sg-core" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611687 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="sg-core" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611702 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="ovn-northd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611709 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="ovn-northd" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611720 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2851fb85-5e8a-46af-9cac-d4df0c5eb16a" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611727 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2851fb85-5e8a-46af-9cac-d4df0c5eb16a" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611739 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611747 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611758 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611765 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611779 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-notification-agent" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611786 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-notification-agent" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611797 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerName="galera" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611805 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerName="galera" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611819 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="dnsmasq-dns" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611828 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="dnsmasq-dns" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611843 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="init" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611851 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="init" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611861 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611868 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611877 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611884 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611898 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611905 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-api" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611919 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="894c1f04-42d4-43de-a34a-19200ceec426" containerName="mysql-bootstrap" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611926 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="894c1f04-42d4-43de-a34a-19200ceec426" containerName="mysql-bootstrap" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611935 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="rabbitmq" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611943 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="rabbitmq" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611956 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611963 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611972 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.611980 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.611993 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612000 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612015 4834 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="81641859-a43e-4d35-bc09-f541277c77da" containerName="nova-cell0-conductor-conductor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612023 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="81641859-a43e-4d35-bc09-f541277c77da" containerName="nova-cell0-conductor-conductor" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612034 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612042 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612054 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612062 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-api" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612072 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612079 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612088 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="swift-recon-cron" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612096 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="swift-recon-cron" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612104 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="proxy-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612111 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="proxy-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612122 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="cinder-scheduler" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612130 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="cinder-scheduler" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612170 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612181 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-server" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612194 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20774f5-74f4-4f7f-9f33-b4b55585cb7d" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612201 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20774f5-74f4-4f7f-9f33-b4b55585cb7d" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612216 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894c1f04-42d4-43de-a34a-19200ceec426" containerName="galera" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612223 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="894c1f04-42d4-43de-a34a-19200ceec426" containerName="galera" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612235 4834 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612242 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612253 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612260 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612274 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f70f2f55-ae76-4f8a-95a4-49933695ff6b" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612283 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f70f2f55-ae76-4f8a-95a4-49933695ff6b" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612299 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612307 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612316 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612323 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 
22:47:23.612333 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-reaper" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612341 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-reaper" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612352 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612359 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612374 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerName="rabbitmq" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612382 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerName="rabbitmq" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612393 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-updater" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612400 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-updater" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612409 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-expirer" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612416 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-expirer" Oct 08 22:47:23 crc kubenswrapper[4834]: 
E1008 22:47:23.612430 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="ovsdbserver-nb" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612438 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="ovsdbserver-nb" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612452 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server-init" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612459 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server-init" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612472 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612479 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612487 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612495 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612507 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612514 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612525 4834 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerName="nova-scheduler-scheduler" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612533 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerName="nova-scheduler-scheduler" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612542 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="ovsdbserver-sb" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612549 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="ovsdbserver-sb" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612562 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612570 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612579 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612587 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612599 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612606 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener" Oct 08 22:47:23 crc 
kubenswrapper[4834]: E1008 22:47:23.612617 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612624 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-server" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612637 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612644 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612656 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612665 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612675 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="rsync" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612682 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="rsync" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612692 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612700 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-api" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612712 
4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612719 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612730 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="788f2464-05b4-4c9a-bd83-6c1365740166" containerName="keystone-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612737 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="788f2464-05b4-4c9a-bd83-6c1365740166" containerName="keystone-api" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612747 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-updater" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612755 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-updater" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612766 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612774 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-server" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612786 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612793 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612805 4834 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf766d3-49fe-4a20-bf0e-405ccca15c69" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612812 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf766d3-49fe-4a20-bf0e-405ccca15c69" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612824 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9087d728-8ea1-4f0c-aff6-7dae2fd139ec" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612832 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9087d728-8ea1-4f0c-aff6-7dae2fd139ec" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.612844 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="probe" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.612851 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="probe" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617255 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617284 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617312 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerName="mysql-bootstrap" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617322 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerName="mysql-bootstrap" Oct 08 22:47:23 crc 
kubenswrapper[4834]: E1008 22:47:23.617340 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-central-agent" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617372 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-central-agent" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617390 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617399 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617412 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617421 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617434 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617444 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617461 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617471 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api-log" Oct 08 22:47:23 crc 
kubenswrapper[4834]: E1008 22:47:23.617486 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168e9a74-197a-4210-a553-7162c2f521af" containerName="memcached" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617496 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="168e9a74-197a-4210-a553-7162c2f521af" containerName="memcached" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617513 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerName="setup-container" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617523 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" containerName="setup-container" Oct 08 22:47:23 crc kubenswrapper[4834]: E1008 22:47:23.617538 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617548 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617816 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617844 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aacb58-3b8d-466d-9b71-e7098b95fe8e" containerName="galera" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617861 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617871 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="sg-core" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 
22:47:23.617886 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617904 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="788f2464-05b4-4c9a-bd83-6c1365740166" containerName="keystone-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617917 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617929 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617944 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f70f2f55-ae76-4f8a-95a4-49933695ff6b" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617960 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617974 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="ovsdbserver-nb" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617985 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d2d9d8-6fb8-45a7-bcba-5d6121b26dda" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.617997 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="12663254-035f-4057-b178-2dc4d42db157" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618008 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-server" Oct 08 
22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618026 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="596135ed-4d76-4dec-94bd-cf17dfbfe2d6" containerName="dnsmasq-dns" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618043 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="swift-recon-cron" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618059 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b73c297-7a02-46b4-88bf-30b239655df8" containerName="kube-state-metrics" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618072 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618088 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-expirer" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618102 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618119 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovs-vswitchd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618135 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f052dbd-010a-456f-af57-0b6b2f6e70ad" containerName="nova-metadata-metadata" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618169 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="proxy-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618181 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a7721f-38a1-4a82-88ed-6f70290b5a6d" 
containerName="rabbitmq" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618196 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-central-agent" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618207 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2851fb85-5e8a-46af-9cac-d4df0c5eb16a" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618222 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618235 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618250 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e43c87-585d-4d7c-bd16-ab66b531e024" containerName="ovsdb-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618262 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f120f0d7-ba00-4502-a2f3-7c619440887a" containerName="nova-scheduler-scheduler" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618274 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="probe" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618285 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5d01ab-b923-4829-9b10-6ad9010216eb" containerName="placement-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618294 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2e3be8-465e-4b20-9586-387cd8d9ca67" containerName="ceilometer-notification-agent" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618310 4834 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618326 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="81641859-a43e-4d35-bc09-f541277c77da" containerName="nova-cell0-conductor-conductor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618337 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618351 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-updater" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618362 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-replicator" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618374 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9809d14f-10d2-479f-94d9-5b3ae7f49e7b" containerName="rabbitmq" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618385 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618401 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fede876-b44b-40e1-8c56-9c35d2528e37" containerName="cinder-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618418 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618430 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 
22:47:23.618450 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20774f5-74f4-4f7f-9f33-b4b55585cb7d" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618464 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5aa1aef-afe2-4b70-9033-c62921f3d106" containerName="proxy-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618478 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="92213f20-28bf-4fe1-b547-6867677b0049" containerName="nova-cell1-conductor-conductor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618489 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f06117-94bf-4e56-b5f7-e83eda8ee811" containerName="mariadb-account-delete" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618502 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-reaper" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618512 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618526 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618537 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4629ae3-d685-43c9-81fd-49e84abd427f" containerName="nova-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618550 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="ovn-northd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618559 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="rsync" Oct 08 
22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618571 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7d4f35-145c-4af9-9f4b-de8700877370" containerName="barbican-worker-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618597 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6122ff69-d6fb-4002-8679-80b826faf58f" containerName="barbican-keystone-listener" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618605 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618621 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="object-updater" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618634 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba8f9be-99cb-4173-aa3b-f8ba2aabb57c" containerName="cinder-scheduler" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618644 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="701b75e6-1acc-47d0-85de-2349a6345a3b" containerName="ovsdbserver-sb" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618653 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e05134-e159-40fe-9c63-a0dc406c8dee" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618662 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="894c1f04-42d4-43de-a34a-19200ceec426" containerName="galera" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618673 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a163bab0-7bd2-4272-a1f0-cd0090eed141" containerName="barbican-api-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618684 4834 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="74f0068c-4e61-4079-9d62-b338472e817d" containerName="ovn-controller" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618696 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c297e1-ec55-4113-a87d-7813a27c03d9" containerName="glance-httpd" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618705 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf766d3-49fe-4a20-bf0e-405ccca15c69" containerName="nova-cell1-novncproxy-novncproxy" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618719 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="37143980-a3f8-4398-a1d7-0f8189fb5366" containerName="glance-log" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618729 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="62795e13-2e9c-4656-ab88-8788e50d37c5" containerName="neutron-api" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618741 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="9087d728-8ea1-4f0c-aff6-7dae2fd139ec" containerName="openstack-network-exporter" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618749 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="account-auditor" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618761 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ab95611-95ff-46bf-9b06-2ed44a58fa46" containerName="container-server" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.618774 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="168e9a74-197a-4210-a553-7162c2f521af" containerName="memcached" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.620651 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.632567 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rpvcv"] Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.778321 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-utilities\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.778472 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-catalog-content\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.778577 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zb8\" (UniqueName: \"kubernetes.io/projected/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-kube-api-access-99zb8\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.879691 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-catalog-content\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.879767 4834 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-99zb8\" (UniqueName: \"kubernetes.io/projected/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-kube-api-access-99zb8\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.879840 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-utilities\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.880432 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-catalog-content\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.880506 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-utilities\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.915721 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zb8\" (UniqueName: \"kubernetes.io/projected/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-kube-api-access-99zb8\") pod \"redhat-operators-rpvcv\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:23 crc kubenswrapper[4834]: I1008 22:47:23.983923 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:24 crc kubenswrapper[4834]: I1008 22:47:24.233334 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rpvcv"] Oct 08 22:47:25 crc kubenswrapper[4834]: I1008 22:47:25.213010 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerStarted","Data":"d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda"} Oct 08 22:47:25 crc kubenswrapper[4834]: I1008 22:47:25.213485 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerStarted","Data":"cd1940a123977524c46f19026f3442949f67f6bfc2bdf6ac98454c89c720c1e8"} Oct 08 22:47:26 crc kubenswrapper[4834]: I1008 22:47:26.226400 4834 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerID="d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda" exitCode=0 Oct 08 22:47:26 crc kubenswrapper[4834]: I1008 22:47:26.226466 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerDied","Data":"d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda"} Oct 08 22:47:28 crc kubenswrapper[4834]: I1008 22:47:28.250008 4834 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerID="0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50" exitCode=0 Oct 08 22:47:28 crc kubenswrapper[4834]: I1008 22:47:28.250128 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" 
event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerDied","Data":"0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50"} Oct 08 22:47:29 crc kubenswrapper[4834]: I1008 22:47:29.263019 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerStarted","Data":"d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f"} Oct 08 22:47:29 crc kubenswrapper[4834]: I1008 22:47:29.290401 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rpvcv" podStartSLOduration=3.568495613 podStartE2EDuration="6.290383173s" podCreationTimestamp="2025-10-08 22:47:23 +0000 UTC" firstStartedPulling="2025-10-08 22:47:26.228533166 +0000 UTC m=+1454.051417912" lastFinishedPulling="2025-10-08 22:47:28.950420716 +0000 UTC m=+1456.773305472" observedRunningTime="2025-10-08 22:47:29.287125224 +0000 UTC m=+1457.110010000" watchObservedRunningTime="2025-10-08 22:47:29.290383173 +0000 UTC m=+1457.113267929" Oct 08 22:47:30 crc kubenswrapper[4834]: I1008 22:47:30.981396 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9lt7"] Oct 08 22:47:30 crc kubenswrapper[4834]: I1008 22:47:30.983127 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.024198 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9lt7"] Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.090068 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-utilities\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.090129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-catalog-content\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.090182 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjvc\" (UniqueName: \"kubernetes.io/projected/27b15995-e7ab-4e7b-ba42-6242f4833f1c-kube-api-access-wvjvc\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.191932 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-utilities\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.191980 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-catalog-content\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.192016 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjvc\" (UniqueName: \"kubernetes.io/projected/27b15995-e7ab-4e7b-ba42-6242f4833f1c-kube-api-access-wvjvc\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.192525 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-utilities\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.192667 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-catalog-content\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.220392 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjvc\" (UniqueName: \"kubernetes.io/projected/27b15995-e7ab-4e7b-ba42-6242f4833f1c-kube-api-access-wvjvc\") pod \"certified-operators-h9lt7\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.311620 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:31 crc kubenswrapper[4834]: I1008 22:47:31.803254 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9lt7"] Oct 08 22:47:32 crc kubenswrapper[4834]: I1008 22:47:32.288742 4834 generic.go:334] "Generic (PLEG): container finished" podID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerID="9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4" exitCode=0 Oct 08 22:47:32 crc kubenswrapper[4834]: I1008 22:47:32.288851 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerDied","Data":"9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4"} Oct 08 22:47:32 crc kubenswrapper[4834]: I1008 22:47:32.289009 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerStarted","Data":"9d2a6d6b0ea83f2f8fddd8c9d604052a166bfbeb55629e66e8bb227b43d02cb8"} Oct 08 22:47:33 crc kubenswrapper[4834]: I1008 22:47:33.984278 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:33 crc kubenswrapper[4834]: I1008 22:47:33.984347 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:34 crc kubenswrapper[4834]: I1008 22:47:34.311825 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerStarted","Data":"28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01"} Oct 08 22:47:35 crc kubenswrapper[4834]: I1008 22:47:35.055305 4834 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-rpvcv" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="registry-server" probeResult="failure" output=< Oct 08 22:47:35 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Oct 08 22:47:35 crc kubenswrapper[4834]: > Oct 08 22:47:35 crc kubenswrapper[4834]: I1008 22:47:35.325530 4834 generic.go:334] "Generic (PLEG): container finished" podID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerID="28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01" exitCode=0 Oct 08 22:47:35 crc kubenswrapper[4834]: I1008 22:47:35.325601 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerDied","Data":"28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01"} Oct 08 22:47:36 crc kubenswrapper[4834]: I1008 22:47:36.336325 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerStarted","Data":"d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812"} Oct 08 22:47:36 crc kubenswrapper[4834]: I1008 22:47:36.360904 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9lt7" podStartSLOduration=2.705194564 podStartE2EDuration="6.36088418s" podCreationTimestamp="2025-10-08 22:47:30 +0000 UTC" firstStartedPulling="2025-10-08 22:47:32.291692227 +0000 UTC m=+1460.114576983" lastFinishedPulling="2025-10-08 22:47:35.947381843 +0000 UTC m=+1463.770266599" observedRunningTime="2025-10-08 22:47:36.358138233 +0000 UTC m=+1464.181022979" watchObservedRunningTime="2025-10-08 22:47:36.36088418 +0000 UTC m=+1464.183768936" Oct 08 22:47:41 crc kubenswrapper[4834]: I1008 22:47:41.311772 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:41 crc kubenswrapper[4834]: I1008 22:47:41.312424 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:41 crc kubenswrapper[4834]: I1008 22:47:41.358136 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:41 crc kubenswrapper[4834]: I1008 22:47:41.435823 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:41 crc kubenswrapper[4834]: I1008 22:47:41.598547 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9lt7"] Oct 08 22:47:43 crc kubenswrapper[4834]: I1008 22:47:43.396488 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9lt7" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="registry-server" containerID="cri-o://d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812" gracePeriod=2 Oct 08 22:47:43 crc kubenswrapper[4834]: I1008 22:47:43.855209 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:43 crc kubenswrapper[4834]: I1008 22:47:43.992846 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvjvc\" (UniqueName: \"kubernetes.io/projected/27b15995-e7ab-4e7b-ba42-6242f4833f1c-kube-api-access-wvjvc\") pod \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " Oct 08 22:47:43 crc kubenswrapper[4834]: I1008 22:47:43.992938 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-utilities\") pod \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " Oct 08 22:47:43 crc kubenswrapper[4834]: I1008 22:47:43.993068 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-catalog-content\") pod \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\" (UID: \"27b15995-e7ab-4e7b-ba42-6242f4833f1c\") " Oct 08 22:47:43 crc kubenswrapper[4834]: I1008 22:47:43.997734 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-utilities" (OuterVolumeSpecName: "utilities") pod "27b15995-e7ab-4e7b-ba42-6242f4833f1c" (UID: "27b15995-e7ab-4e7b-ba42-6242f4833f1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.002389 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b15995-e7ab-4e7b-ba42-6242f4833f1c-kube-api-access-wvjvc" (OuterVolumeSpecName: "kube-api-access-wvjvc") pod "27b15995-e7ab-4e7b-ba42-6242f4833f1c" (UID: "27b15995-e7ab-4e7b-ba42-6242f4833f1c"). InnerVolumeSpecName "kube-api-access-wvjvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.042632 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.046653 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27b15995-e7ab-4e7b-ba42-6242f4833f1c" (UID: "27b15995-e7ab-4e7b-ba42-6242f4833f1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.095215 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvjvc\" (UniqueName: \"kubernetes.io/projected/27b15995-e7ab-4e7b-ba42-6242f4833f1c-kube-api-access-wvjvc\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.095254 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.095268 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27b15995-e7ab-4e7b-ba42-6242f4833f1c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.101483 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.410401 4834 generic.go:334] "Generic (PLEG): container finished" podID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerID="d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812" exitCode=0 Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 
22:47:44.411065 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9lt7" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.411322 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerDied","Data":"d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812"} Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.411378 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9lt7" event={"ID":"27b15995-e7ab-4e7b-ba42-6242f4833f1c","Type":"ContainerDied","Data":"9d2a6d6b0ea83f2f8fddd8c9d604052a166bfbeb55629e66e8bb227b43d02cb8"} Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.411402 4834 scope.go:117] "RemoveContainer" containerID="d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.446229 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9lt7"] Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.451645 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9lt7"] Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.454693 4834 scope.go:117] "RemoveContainer" containerID="28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.488917 4834 scope.go:117] "RemoveContainer" containerID="9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.508954 4834 scope.go:117] "RemoveContainer" containerID="d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812" Oct 08 22:47:44 crc kubenswrapper[4834]: E1008 22:47:44.509404 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812\": container with ID starting with d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812 not found: ID does not exist" containerID="d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.509440 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812"} err="failed to get container status \"d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812\": rpc error: code = NotFound desc = could not find container \"d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812\": container with ID starting with d5493534a7e272a2edb68743dbad7a65558d399f10f62ec5d384fe6fb6a9e812 not found: ID does not exist" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.509462 4834 scope.go:117] "RemoveContainer" containerID="28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01" Oct 08 22:47:44 crc kubenswrapper[4834]: E1008 22:47:44.509737 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01\": container with ID starting with 28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01 not found: ID does not exist" containerID="28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.509759 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01"} err="failed to get container status \"28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01\": rpc error: code = NotFound desc = could not find container 
\"28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01\": container with ID starting with 28d33a005301f08f61041837e6f1980680873ce2ccb9535276b47b5e0464ca01 not found: ID does not exist" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.509774 4834 scope.go:117] "RemoveContainer" containerID="9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4" Oct 08 22:47:44 crc kubenswrapper[4834]: E1008 22:47:44.510031 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4\": container with ID starting with 9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4 not found: ID does not exist" containerID="9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4" Oct 08 22:47:44 crc kubenswrapper[4834]: I1008 22:47:44.510054 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4"} err="failed to get container status \"9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4\": rpc error: code = NotFound desc = could not find container \"9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4\": container with ID starting with 9610223ce46e63a4707f6bc03a8e8f8fdd3d38fbd7a9f9f87b8bff957ab9ecf4 not found: ID does not exist" Oct 08 22:47:45 crc kubenswrapper[4834]: I1008 22:47:45.401105 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rpvcv"] Oct 08 22:47:45 crc kubenswrapper[4834]: I1008 22:47:45.427760 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rpvcv" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="registry-server" containerID="cri-o://d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f" gracePeriod=2 Oct 08 22:47:45 crc 
kubenswrapper[4834]: I1008 22:47:45.565910 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" path="/var/lib/kubelet/pods/27b15995-e7ab-4e7b-ba42-6242f4833f1c/volumes" Oct 08 22:47:45 crc kubenswrapper[4834]: I1008 22:47:45.979907 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.123648 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zb8\" (UniqueName: \"kubernetes.io/projected/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-kube-api-access-99zb8\") pod \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.123740 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-utilities\") pod \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.123801 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-catalog-content\") pod \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\" (UID: \"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f\") " Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.125455 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-utilities" (OuterVolumeSpecName: "utilities") pod "fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" (UID: "fbf0a754-ab6e-4d0f-ba51-7a33e23d505f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.129421 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-kube-api-access-99zb8" (OuterVolumeSpecName: "kube-api-access-99zb8") pod "fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" (UID: "fbf0a754-ab6e-4d0f-ba51-7a33e23d505f"). InnerVolumeSpecName "kube-api-access-99zb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.223374 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" (UID: "fbf0a754-ab6e-4d0f-ba51-7a33e23d505f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.225690 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99zb8\" (UniqueName: \"kubernetes.io/projected/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-kube-api-access-99zb8\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.225713 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.225724 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.443390 4834 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" 
containerID="d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f" exitCode=0 Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.443462 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerDied","Data":"d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f"} Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.443554 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rpvcv" event={"ID":"fbf0a754-ab6e-4d0f-ba51-7a33e23d505f","Type":"ContainerDied","Data":"cd1940a123977524c46f19026f3442949f67f6bfc2bdf6ac98454c89c720c1e8"} Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.443590 4834 scope.go:117] "RemoveContainer" containerID="d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.443744 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rpvcv" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.477807 4834 scope.go:117] "RemoveContainer" containerID="0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.507682 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rpvcv"] Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.516011 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rpvcv"] Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.518429 4834 scope.go:117] "RemoveContainer" containerID="d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.551614 4834 scope.go:117] "RemoveContainer" containerID="d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f" Oct 08 22:47:46 crc kubenswrapper[4834]: E1008 22:47:46.552212 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f\": container with ID starting with d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f not found: ID does not exist" containerID="d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.552260 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f"} err="failed to get container status \"d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f\": rpc error: code = NotFound desc = could not find container \"d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f\": container with ID starting with d29489734f6f1c6907c3a2d24d412ec5e83e213da6a4066f792e78a3bbdb819f not found: ID does 
not exist" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.552291 4834 scope.go:117] "RemoveContainer" containerID="0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50" Oct 08 22:47:46 crc kubenswrapper[4834]: E1008 22:47:46.552750 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50\": container with ID starting with 0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50 not found: ID does not exist" containerID="0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.552841 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50"} err="failed to get container status \"0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50\": rpc error: code = NotFound desc = could not find container \"0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50\": container with ID starting with 0809ed854df12c34d1bbd4ba11e2343fa00fce62edd2325516ab772122131c50 not found: ID does not exist" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.552894 4834 scope.go:117] "RemoveContainer" containerID="d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda" Oct 08 22:47:46 crc kubenswrapper[4834]: E1008 22:47:46.553738 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda\": container with ID starting with d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda not found: ID does not exist" containerID="d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda" Oct 08 22:47:46 crc kubenswrapper[4834]: I1008 22:47:46.553786 4834 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda"} err="failed to get container status \"d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda\": rpc error: code = NotFound desc = could not find container \"d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda\": container with ID starting with d3e185854e1285944ea5e6851e58f8ef26b22f77662f7ebf2287bf2c2eed3fda not found: ID does not exist" Oct 08 22:47:47 crc kubenswrapper[4834]: I1008 22:47:47.571856 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" path="/var/lib/kubelet/pods/fbf0a754-ab6e-4d0f-ba51-7a33e23d505f/volumes" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.083797 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhwzl"] Oct 08 22:47:58 crc kubenswrapper[4834]: E1008 22:47:58.084518 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="extract-utilities" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.084529 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="extract-utilities" Oct 08 22:47:58 crc kubenswrapper[4834]: E1008 22:47:58.084547 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="extract-content" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.084554 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="extract-content" Oct 08 22:47:58 crc kubenswrapper[4834]: E1008 22:47:58.084566 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="extract-content" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.084574 4834 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="extract-content" Oct 08 22:47:58 crc kubenswrapper[4834]: E1008 22:47:58.084885 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="extract-utilities" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.084895 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="extract-utilities" Oct 08 22:47:58 crc kubenswrapper[4834]: E1008 22:47:58.084907 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="registry-server" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.084914 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="registry-server" Oct 08 22:47:58 crc kubenswrapper[4834]: E1008 22:47:58.084927 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="registry-server" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.084934 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="registry-server" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.085069 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b15995-e7ab-4e7b-ba42-6242f4833f1c" containerName="registry-server" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.085086 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf0a754-ab6e-4d0f-ba51-7a33e23d505f" containerName="registry-server" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.086008 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.109708 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhwzl"] Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.208729 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f8vr\" (UniqueName: \"kubernetes.io/projected/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-kube-api-access-8f8vr\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.208965 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-catalog-content\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.209118 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-utilities\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.310945 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f8vr\" (UniqueName: \"kubernetes.io/projected/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-kube-api-access-8f8vr\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.311076 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-catalog-content\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.311115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-utilities\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.311663 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-utilities\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.311766 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-catalog-content\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.334278 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f8vr\" (UniqueName: \"kubernetes.io/projected/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-kube-api-access-8f8vr\") pod \"redhat-marketplace-fhwzl\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.404698 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:47:58 crc kubenswrapper[4834]: I1008 22:47:58.837840 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhwzl"] Oct 08 22:47:59 crc kubenswrapper[4834]: I1008 22:47:59.578685 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerID="600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18" exitCode=0 Oct 08 22:47:59 crc kubenswrapper[4834]: I1008 22:47:59.578800 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerDied","Data":"600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18"} Oct 08 22:47:59 crc kubenswrapper[4834]: I1008 22:47:59.579008 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerStarted","Data":"da67973e57c975271ebb520328772bffb6e6c66401ab8fb9ab86ce5b22a4ef32"} Oct 08 22:48:00 crc kubenswrapper[4834]: I1008 22:48:00.592020 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerStarted","Data":"ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9"} Oct 08 22:48:01 crc kubenswrapper[4834]: I1008 22:48:01.604445 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerID="ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9" exitCode=0 Oct 08 22:48:01 crc kubenswrapper[4834]: I1008 22:48:01.604548 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" 
event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerDied","Data":"ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9"} Oct 08 22:48:02 crc kubenswrapper[4834]: I1008 22:48:02.621376 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerStarted","Data":"f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0"} Oct 08 22:48:02 crc kubenswrapper[4834]: I1008 22:48:02.646852 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhwzl" podStartSLOduration=2.189553292 podStartE2EDuration="4.64683583s" podCreationTimestamp="2025-10-08 22:47:58 +0000 UTC" firstStartedPulling="2025-10-08 22:47:59.581665807 +0000 UTC m=+1487.404550593" lastFinishedPulling="2025-10-08 22:48:02.038948385 +0000 UTC m=+1489.861833131" observedRunningTime="2025-10-08 22:48:02.644414781 +0000 UTC m=+1490.467299527" watchObservedRunningTime="2025-10-08 22:48:02.64683583 +0000 UTC m=+1490.469720576" Oct 08 22:48:08 crc kubenswrapper[4834]: I1008 22:48:08.405333 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:48:08 crc kubenswrapper[4834]: I1008 22:48:08.406113 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:48:08 crc kubenswrapper[4834]: I1008 22:48:08.488997 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:48:08 crc kubenswrapper[4834]: I1008 22:48:08.734663 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:48:08 crc kubenswrapper[4834]: I1008 22:48:08.800991 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-fhwzl"] Oct 08 22:48:10 crc kubenswrapper[4834]: I1008 22:48:10.712531 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fhwzl" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="registry-server" containerID="cri-o://f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0" gracePeriod=2 Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.335613 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.526897 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-utilities\") pod \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.526979 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-catalog-content\") pod \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.527236 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f8vr\" (UniqueName: \"kubernetes.io/projected/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-kube-api-access-8f8vr\") pod \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\" (UID: \"e7dbb2d6-d8c3-4626-9938-18b7631ecaad\") " Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.529279 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-utilities" (OuterVolumeSpecName: "utilities") pod "e7dbb2d6-d8c3-4626-9938-18b7631ecaad" (UID: 
"e7dbb2d6-d8c3-4626-9938-18b7631ecaad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.536073 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-kube-api-access-8f8vr" (OuterVolumeSpecName: "kube-api-access-8f8vr") pod "e7dbb2d6-d8c3-4626-9938-18b7631ecaad" (UID: "e7dbb2d6-d8c3-4626-9938-18b7631ecaad"). InnerVolumeSpecName "kube-api-access-8f8vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.633493 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f8vr\" (UniqueName: \"kubernetes.io/projected/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-kube-api-access-8f8vr\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.633556 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.728263 4834 generic.go:334] "Generic (PLEG): container finished" podID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerID="f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0" exitCode=0 Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.728331 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerDied","Data":"f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0"} Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.728414 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhwzl" 
event={"ID":"e7dbb2d6-d8c3-4626-9938-18b7631ecaad","Type":"ContainerDied","Data":"da67973e57c975271ebb520328772bffb6e6c66401ab8fb9ab86ce5b22a4ef32"} Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.728464 4834 scope.go:117] "RemoveContainer" containerID="f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.728588 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhwzl" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.766073 4834 scope.go:117] "RemoveContainer" containerID="ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.805610 4834 scope.go:117] "RemoveContainer" containerID="600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.842332 4834 scope.go:117] "RemoveContainer" containerID="f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0" Oct 08 22:48:11 crc kubenswrapper[4834]: E1008 22:48:11.843024 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0\": container with ID starting with f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0 not found: ID does not exist" containerID="f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.843097 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0"} err="failed to get container status \"f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0\": rpc error: code = NotFound desc = could not find container \"f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0\": 
container with ID starting with f09dffc59216b940c977be879ce2a529adcf6337aa980604f2a7adca3c9e57d0 not found: ID does not exist" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.843136 4834 scope.go:117] "RemoveContainer" containerID="ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9" Oct 08 22:48:11 crc kubenswrapper[4834]: E1008 22:48:11.843775 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9\": container with ID starting with ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9 not found: ID does not exist" containerID="ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.843840 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9"} err="failed to get container status \"ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9\": rpc error: code = NotFound desc = could not find container \"ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9\": container with ID starting with ff9a0c312385f2d5dd74a5cc8c9c33e459bc13c51c07343ef76dadb597ae0bd9 not found: ID does not exist" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.843904 4834 scope.go:117] "RemoveContainer" containerID="600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18" Oct 08 22:48:11 crc kubenswrapper[4834]: E1008 22:48:11.844767 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18\": container with ID starting with 600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18 not found: ID does not exist" 
containerID="600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.844830 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18"} err="failed to get container status \"600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18\": rpc error: code = NotFound desc = could not find container \"600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18\": container with ID starting with 600efb11bb7ebdd0cadb01fb8b3927ad2c6fa7c83be9fcdd3c6d3a693889ad18 not found: ID does not exist" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.906677 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7dbb2d6-d8c3-4626-9938-18b7631ecaad" (UID: "e7dbb2d6-d8c3-4626-9938-18b7631ecaad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:48:11 crc kubenswrapper[4834]: I1008 22:48:11.938122 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7dbb2d6-d8c3-4626-9938-18b7631ecaad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:48:12 crc kubenswrapper[4834]: I1008 22:48:12.084192 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhwzl"] Oct 08 22:48:12 crc kubenswrapper[4834]: I1008 22:48:12.090078 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhwzl"] Oct 08 22:48:13 crc kubenswrapper[4834]: I1008 22:48:13.588246 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" path="/var/lib/kubelet/pods/e7dbb2d6-d8c3-4626-9938-18b7631ecaad/volumes" Oct 08 22:48:17 crc kubenswrapper[4834]: I1008 22:48:17.025779 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:48:17 crc kubenswrapper[4834]: I1008 22:48:17.026121 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.268748 4834 scope.go:117] "RemoveContainer" containerID="c1ee0707b352d5c949d1b1752e202629de61e58089c1568cc17419ade6ddee20" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.330949 4834 scope.go:117] "RemoveContainer" 
containerID="be441ca6796f4a769b5bac5d97b9e63bade7cf01a581a51b7711992d0886f68a" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.359375 4834 scope.go:117] "RemoveContainer" containerID="aafc6f28f6192bb2e21a743fca46de49ec0f69b188ca6747d25a84008f13d7e4" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.392577 4834 scope.go:117] "RemoveContainer" containerID="95ac071ead20f06847c957bb59d09b617365f2bd85c8f5567ed056724b25d3f8" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.424687 4834 scope.go:117] "RemoveContainer" containerID="a7a8be0c1ab1b96c033a071774588a8aee8e700671079c515bad11ab7aef7811" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.457192 4834 scope.go:117] "RemoveContainer" containerID="cfae18dfe7163f5270b1b9a333bf53a39b350a54593ea7ad6ea84c4a926bd51d" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.491686 4834 scope.go:117] "RemoveContainer" containerID="c2e3f3cf71362e248e776b5437f379560fe494d7b88c76a620e0eb8ccd7a5466" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.533519 4834 scope.go:117] "RemoveContainer" containerID="af00c748dce9310bfc4c5c0af24bc0e0c9f62dba443f13c4b67ecf0ccef26abf" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.564533 4834 scope.go:117] "RemoveContainer" containerID="dd65c7ce1dc2a6b52c253c3d1d5fdcb37949e830b5563bff431e26264e98a133" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.597815 4834 scope.go:117] "RemoveContainer" containerID="26138348638c67ebd21b0a53f61f166b58c8148f9fe6ddcb4703f9285102299a" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.629259 4834 scope.go:117] "RemoveContainer" containerID="af52d1bc1732e50cef0ae70a87734710490b056806da58f0bfa53ce4b2072fa4" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.647551 4834 scope.go:117] "RemoveContainer" containerID="dfda234ecadd5a07767bab392e4e0bb6c663083ca044e85134c0836b10b98799" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.676843 4834 scope.go:117] "RemoveContainer" 
containerID="e4462766e55e41ac2f173aa20c7c41f1e35a51ebc38de144712b6577aab16b7d" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.700752 4834 scope.go:117] "RemoveContainer" containerID="4fabbc353f24c6c2298b39e11ea1fd36fa1a517790dad8dfc9cbf3232a5ba1e4" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.727575 4834 scope.go:117] "RemoveContainer" containerID="9aa7536da54254fbbacb96af2d84bae4ba72329943d5c430446435f27ee30123" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.775104 4834 scope.go:117] "RemoveContainer" containerID="d868a94140ac15e6896c96857c31093ceddf5d4d44594dedeb5f93723c331d7d" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.816175 4834 scope.go:117] "RemoveContainer" containerID="37af688ed90d76271eb63e211669328e52636154c9cb1ea6ab93a0c2ac6a207b" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.854548 4834 scope.go:117] "RemoveContainer" containerID="477d72b2a6782571ffb4926c62d280ced0b20678e4a5c010ea8640ecaae5b71b" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.909098 4834 scope.go:117] "RemoveContainer" containerID="251b7ba4f4bb3ba2c276ea693fd961db641b65c3db7647c9aed9146f4376a84b" Oct 08 22:48:19 crc kubenswrapper[4834]: I1008 22:48:19.939583 4834 scope.go:117] "RemoveContainer" containerID="299426ab604bd850b419cb8a481e6164770f8d6cf6ebc2e23b67097f996fed1d" Oct 08 22:48:20 crc kubenswrapper[4834]: I1008 22:48:20.000582 4834 scope.go:117] "RemoveContainer" containerID="0c9ac7a53523f40364b8d33dc72f9c865f268d08e573b6598312922e667c5174" Oct 08 22:48:47 crc kubenswrapper[4834]: I1008 22:48:47.026034 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:48:47 crc kubenswrapper[4834]: I1008 22:48:47.027024 4834 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.041121 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwjwl"] Oct 08 22:49:09 crc kubenswrapper[4834]: E1008 22:49:09.041989 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="registry-server" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.042002 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="registry-server" Oct 08 22:49:09 crc kubenswrapper[4834]: E1008 22:49:09.042032 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="extract-utilities" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.042038 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="extract-utilities" Oct 08 22:49:09 crc kubenswrapper[4834]: E1008 22:49:09.042050 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="extract-content" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.042055 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="extract-content" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.042203 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7dbb2d6-d8c3-4626-9938-18b7631ecaad" containerName="registry-server" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.043132 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.063048 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwjwl"] Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.081059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-utilities\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.081652 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmsr\" (UniqueName: \"kubernetes.io/projected/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-kube-api-access-tvmsr\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.081823 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-catalog-content\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.182783 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmsr\" (UniqueName: \"kubernetes.io/projected/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-kube-api-access-tvmsr\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.182860 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-catalog-content\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.182889 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-utilities\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.183456 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-catalog-content\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.183552 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-utilities\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.211275 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmsr\" (UniqueName: \"kubernetes.io/projected/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-kube-api-access-tvmsr\") pod \"community-operators-rwjwl\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.374895 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:09 crc kubenswrapper[4834]: I1008 22:49:09.617900 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwjwl"] Oct 08 22:49:09 crc kubenswrapper[4834]: W1008 22:49:09.625771 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b204b8_6f1e_4965_a656_0b7ffbdf03ec.slice/crio-aab85b9131afa4e7b7dfff4e7a0e81b1089340e29b812c071a3ce9409514467a WatchSource:0}: Error finding container aab85b9131afa4e7b7dfff4e7a0e81b1089340e29b812c071a3ce9409514467a: Status 404 returned error can't find the container with id aab85b9131afa4e7b7dfff4e7a0e81b1089340e29b812c071a3ce9409514467a Oct 08 22:49:10 crc kubenswrapper[4834]: I1008 22:49:10.349094 4834 generic.go:334] "Generic (PLEG): container finished" podID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerID="54f90fa1a18c048d499fdace93ae048415aff712c3ab36a735ef1a8276d61ced" exitCode=0 Oct 08 22:49:10 crc kubenswrapper[4834]: I1008 22:49:10.349227 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerDied","Data":"54f90fa1a18c048d499fdace93ae048415aff712c3ab36a735ef1a8276d61ced"} Oct 08 22:49:10 crc kubenswrapper[4834]: I1008 22:49:10.349267 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerStarted","Data":"aab85b9131afa4e7b7dfff4e7a0e81b1089340e29b812c071a3ce9409514467a"} Oct 08 22:49:11 crc kubenswrapper[4834]: I1008 22:49:11.360594 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" 
event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerStarted","Data":"eb216c2e759339cddae9876fff2694a08280a93ea8c7da9573a817a3d137b2df"} Oct 08 22:49:12 crc kubenswrapper[4834]: I1008 22:49:12.369933 4834 generic.go:334] "Generic (PLEG): container finished" podID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerID="eb216c2e759339cddae9876fff2694a08280a93ea8c7da9573a817a3d137b2df" exitCode=0 Oct 08 22:49:12 crc kubenswrapper[4834]: I1008 22:49:12.370013 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerDied","Data":"eb216c2e759339cddae9876fff2694a08280a93ea8c7da9573a817a3d137b2df"} Oct 08 22:49:13 crc kubenswrapper[4834]: I1008 22:49:13.381896 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerStarted","Data":"8a7940dee8ef94ee5a926d20c97cc75d4ee668cc523dd883381a59c128998490"} Oct 08 22:49:13 crc kubenswrapper[4834]: I1008 22:49:13.405874 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwjwl" podStartSLOduration=1.963837995 podStartE2EDuration="4.405848664s" podCreationTimestamp="2025-10-08 22:49:09 +0000 UTC" firstStartedPulling="2025-10-08 22:49:10.350817044 +0000 UTC m=+1558.173701810" lastFinishedPulling="2025-10-08 22:49:12.792827723 +0000 UTC m=+1560.615712479" observedRunningTime="2025-10-08 22:49:13.396396083 +0000 UTC m=+1561.219280839" watchObservedRunningTime="2025-10-08 22:49:13.405848664 +0000 UTC m=+1561.228733420" Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.025930 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.026661 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.026733 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.027805 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.027920 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" gracePeriod=600 Oct 08 22:49:17 crc kubenswrapper[4834]: E1008 22:49:17.289577 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:49:17 crc 
kubenswrapper[4834]: I1008 22:49:17.420747 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" exitCode=0 Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.420796 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f"} Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.420832 4834 scope.go:117] "RemoveContainer" containerID="6163dc66da07cee67fb6457237db1afec09f5bef4082ecfbaefff77cd8dc028c" Oct 08 22:49:17 crc kubenswrapper[4834]: I1008 22:49:17.421516 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:49:17 crc kubenswrapper[4834]: E1008 22:49:17.422090 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:49:19 crc kubenswrapper[4834]: I1008 22:49:19.375257 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:19 crc kubenswrapper[4834]: I1008 22:49:19.375641 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:19 crc kubenswrapper[4834]: I1008 22:49:19.436916 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 
22:49:19 crc kubenswrapper[4834]: I1008 22:49:19.495044 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:19 crc kubenswrapper[4834]: I1008 22:49:19.688798 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwjwl"] Oct 08 22:49:20 crc kubenswrapper[4834]: I1008 22:49:20.485763 4834 scope.go:117] "RemoveContainer" containerID="2aa265b22953ecc5556abee66523e5673bc96ed54d9755bb598e92161a98743d" Oct 08 22:49:20 crc kubenswrapper[4834]: I1008 22:49:20.542776 4834 scope.go:117] "RemoveContainer" containerID="43fb96c167efdaaa1299cb06ffba0aaaeaff5f3e9c0bac460dec7ea52e491fdd" Oct 08 22:49:21 crc kubenswrapper[4834]: I1008 22:49:21.460885 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwjwl" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="registry-server" containerID="cri-o://8a7940dee8ef94ee5a926d20c97cc75d4ee668cc523dd883381a59c128998490" gracePeriod=2 Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.480863 4834 generic.go:334] "Generic (PLEG): container finished" podID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerID="8a7940dee8ef94ee5a926d20c97cc75d4ee668cc523dd883381a59c128998490" exitCode=0 Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.480920 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerDied","Data":"8a7940dee8ef94ee5a926d20c97cc75d4ee668cc523dd883381a59c128998490"} Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.663995 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.781420 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-utilities\") pod \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.781489 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmsr\" (UniqueName: \"kubernetes.io/projected/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-kube-api-access-tvmsr\") pod \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.781566 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-catalog-content\") pod \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\" (UID: \"92b204b8-6f1e-4965-a656-0b7ffbdf03ec\") " Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.784452 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-utilities" (OuterVolumeSpecName: "utilities") pod "92b204b8-6f1e-4965-a656-0b7ffbdf03ec" (UID: "92b204b8-6f1e-4965-a656-0b7ffbdf03ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.787821 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-kube-api-access-tvmsr" (OuterVolumeSpecName: "kube-api-access-tvmsr") pod "92b204b8-6f1e-4965-a656-0b7ffbdf03ec" (UID: "92b204b8-6f1e-4965-a656-0b7ffbdf03ec"). InnerVolumeSpecName "kube-api-access-tvmsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.849942 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92b204b8-6f1e-4965-a656-0b7ffbdf03ec" (UID: "92b204b8-6f1e-4965-a656-0b7ffbdf03ec"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.883164 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.883198 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmsr\" (UniqueName: \"kubernetes.io/projected/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-kube-api-access-tvmsr\") on node \"crc\" DevicePath \"\"" Oct 08 22:49:22 crc kubenswrapper[4834]: I1008 22:49:22.883208 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92b204b8-6f1e-4965-a656-0b7ffbdf03ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.499684 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwjwl" event={"ID":"92b204b8-6f1e-4965-a656-0b7ffbdf03ec","Type":"ContainerDied","Data":"aab85b9131afa4e7b7dfff4e7a0e81b1089340e29b812c071a3ce9409514467a"} Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.499893 4834 scope.go:117] "RemoveContainer" containerID="8a7940dee8ef94ee5a926d20c97cc75d4ee668cc523dd883381a59c128998490" Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.499931 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwjwl" Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.520648 4834 scope.go:117] "RemoveContainer" containerID="eb216c2e759339cddae9876fff2694a08280a93ea8c7da9573a817a3d137b2df" Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.563097 4834 scope.go:117] "RemoveContainer" containerID="54f90fa1a18c048d499fdace93ae048415aff712c3ab36a735ef1a8276d61ced" Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.573630 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwjwl"] Oct 08 22:49:23 crc kubenswrapper[4834]: I1008 22:49:23.584342 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwjwl"] Oct 08 22:49:25 crc kubenswrapper[4834]: I1008 22:49:25.572571 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" path="/var/lib/kubelet/pods/92b204b8-6f1e-4965-a656-0b7ffbdf03ec/volumes" Oct 08 22:49:29 crc kubenswrapper[4834]: I1008 22:49:29.556502 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:49:29 crc kubenswrapper[4834]: E1008 22:49:29.557188 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:49:40 crc kubenswrapper[4834]: I1008 22:49:40.555290 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:49:40 crc kubenswrapper[4834]: E1008 22:49:40.555974 4834 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:49:53 crc kubenswrapper[4834]: I1008 22:49:53.564278 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:49:53 crc kubenswrapper[4834]: E1008 22:49:53.565686 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:50:08 crc kubenswrapper[4834]: I1008 22:50:08.555846 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:50:08 crc kubenswrapper[4834]: E1008 22:50:08.557089 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.604374 4834 scope.go:117] "RemoveContainer" containerID="00a28d17edcc7b5ed388c8fc1b886f0caf34114f3bd6fb165a2bcc569450f50d" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.662314 4834 scope.go:117] 
"RemoveContainer" containerID="8178bb5c63d59751fa04e9a8611028105cb6a3f042c30ae51e01c44207c48306" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.693216 4834 scope.go:117] "RemoveContainer" containerID="99327c7137c459ab6c1a2243cfe4b0f0f55a62c5f12e27cb1eae90bc24efe0be" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.717425 4834 scope.go:117] "RemoveContainer" containerID="4aa0ba726ca1aa64b7eeea2ec8352bfcf0ba2a169332fa4c550d65f578ecb4f3" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.744004 4834 scope.go:117] "RemoveContainer" containerID="57b633ed2d101f8136f8501326ad9126727b1b6fbd68324d56f1644808d3cdaf" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.800934 4834 scope.go:117] "RemoveContainer" containerID="62ac2f468ed7a8ca1ccfd138476149a4798728df3269c8a1e691fb31a153f440" Oct 08 22:50:20 crc kubenswrapper[4834]: I1008 22:50:20.823373 4834 scope.go:117] "RemoveContainer" containerID="fa460528d511bb58346a4686bdd10ae400531ea0b52fc947a827453536728a0c" Oct 08 22:50:22 crc kubenswrapper[4834]: I1008 22:50:22.556808 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:50:22 crc kubenswrapper[4834]: E1008 22:50:22.557305 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:50:35 crc kubenswrapper[4834]: I1008 22:50:35.555520 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:50:35 crc kubenswrapper[4834]: E1008 22:50:35.556711 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:50:46 crc kubenswrapper[4834]: I1008 22:50:46.555507 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:50:46 crc kubenswrapper[4834]: E1008 22:50:46.556494 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:51:00 crc kubenswrapper[4834]: I1008 22:51:00.556471 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:51:00 crc kubenswrapper[4834]: E1008 22:51:00.557483 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:51:13 crc kubenswrapper[4834]: I1008 22:51:13.563667 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:51:13 crc kubenswrapper[4834]: E1008 22:51:13.564554 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:51:20 crc kubenswrapper[4834]: I1008 22:51:20.944944 4834 scope.go:117] "RemoveContainer" containerID="17651b84fca04f45f755cc82539e7ab66b0bf27627a435857700c2da43a23544" Oct 08 22:51:20 crc kubenswrapper[4834]: I1008 22:51:20.979965 4834 scope.go:117] "RemoveContainer" containerID="2c075f9f313e798e44f653a417d4b03da76fd6a78630718227b5b296a78fcde3" Oct 08 22:51:21 crc kubenswrapper[4834]: I1008 22:51:21.020168 4834 scope.go:117] "RemoveContainer" containerID="2374fa95bd5da7dd8560e90d3040135b0604e7518dabea482807d6d2761dc576" Oct 08 22:51:21 crc kubenswrapper[4834]: I1008 22:51:21.102411 4834 scope.go:117] "RemoveContainer" containerID="056ba376def004f5c203b607fe214be5d15dfb07d6766619f892b183abcc6853" Oct 08 22:51:21 crc kubenswrapper[4834]: I1008 22:51:21.161189 4834 scope.go:117] "RemoveContainer" containerID="ce5764d2183a5398012423761f4ca8e1f6b29ff026fd012d325298c4b1b7f24d" Oct 08 22:51:21 crc kubenswrapper[4834]: I1008 22:51:21.208882 4834 scope.go:117] "RemoveContainer" containerID="28b6df0ac931544576daa108f0b9d18f976008b14b16d6e24e5fafe553822bd9" Oct 08 22:51:21 crc kubenswrapper[4834]: I1008 22:51:21.245375 4834 scope.go:117] "RemoveContainer" containerID="67cba6b2e6fb5760abc4a1fed36847d10ef3d9623587e8d7c3c492e7f0b8bcbb" Oct 08 22:51:21 crc kubenswrapper[4834]: I1008 22:51:21.274109 4834 scope.go:117] "RemoveContainer" containerID="d26dc2b24f75c6416e3da5affb9ed2e0641dc5567fec5b856133b898272689ca" Oct 08 22:51:26 crc kubenswrapper[4834]: I1008 22:51:26.555020 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:51:26 crc 
kubenswrapper[4834]: E1008 22:51:26.555636 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:51:38 crc kubenswrapper[4834]: I1008 22:51:38.555707 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:51:38 crc kubenswrapper[4834]: E1008 22:51:38.556965 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:51:49 crc kubenswrapper[4834]: I1008 22:51:49.556134 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:51:49 crc kubenswrapper[4834]: E1008 22:51:49.557113 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:52:01 crc kubenswrapper[4834]: I1008 22:52:01.557285 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 
08 22:52:01 crc kubenswrapper[4834]: E1008 22:52:01.557956 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:52:12 crc kubenswrapper[4834]: I1008 22:52:12.555241 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:52:12 crc kubenswrapper[4834]: E1008 22:52:12.556118 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:52:21 crc kubenswrapper[4834]: I1008 22:52:21.443822 4834 scope.go:117] "RemoveContainer" containerID="d772ec43c319ed37efddc72f55b2cad49aa8ee5fa72993e5c58323b2ce4d6343" Oct 08 22:52:21 crc kubenswrapper[4834]: I1008 22:52:21.497685 4834 scope.go:117] "RemoveContainer" containerID="ffff8b102cbea74492faff4cfca190c1120b7bf0f34305db2f2ebb11b62d6d87" Oct 08 22:52:21 crc kubenswrapper[4834]: I1008 22:52:21.532772 4834 scope.go:117] "RemoveContainer" containerID="1229c847ebd47dc74a7d80badde533a7cbbcc001fda7ef8707bb039596a86a5d" Oct 08 22:52:27 crc kubenswrapper[4834]: I1008 22:52:27.556133 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:52:27 crc kubenswrapper[4834]: E1008 22:52:27.557178 4834 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:52:40 crc kubenswrapper[4834]: I1008 22:52:40.556188 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:52:40 crc kubenswrapper[4834]: E1008 22:52:40.557111 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:52:51 crc kubenswrapper[4834]: I1008 22:52:51.556036 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:52:51 crc kubenswrapper[4834]: E1008 22:52:51.557307 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:53:03 crc kubenswrapper[4834]: I1008 22:53:03.565867 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:53:03 crc kubenswrapper[4834]: E1008 22:53:03.567226 4834 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:53:18 crc kubenswrapper[4834]: I1008 22:53:18.555516 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:53:18 crc kubenswrapper[4834]: E1008 22:53:18.556183 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:53:21 crc kubenswrapper[4834]: I1008 22:53:21.608182 4834 scope.go:117] "RemoveContainer" containerID="1e8ddb85f75ede5ff35a353d39c39d0175ca89fe8ddc6c8c24ac6790a0da17e1" Oct 08 22:53:21 crc kubenswrapper[4834]: I1008 22:53:21.645960 4834 scope.go:117] "RemoveContainer" containerID="1fd094e821088e300fb522feee24e3f5c1422f0a0b776c6baafc3a9ef2e8e60d" Oct 08 22:53:33 crc kubenswrapper[4834]: I1008 22:53:33.558760 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:53:33 crc kubenswrapper[4834]: E1008 22:53:33.560414 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:53:47 crc kubenswrapper[4834]: I1008 22:53:47.557677 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:53:47 crc kubenswrapper[4834]: E1008 22:53:47.558651 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:54:01 crc kubenswrapper[4834]: I1008 22:54:01.556302 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:54:01 crc kubenswrapper[4834]: E1008 22:54:01.557431 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:54:14 crc kubenswrapper[4834]: I1008 22:54:14.556321 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:54:14 crc kubenswrapper[4834]: E1008 22:54:14.557006 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 22:54:29 crc kubenswrapper[4834]: I1008 22:54:29.557208 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:54:30 crc kubenswrapper[4834]: I1008 22:54:30.401340 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"ab28162cf4c7e6fb4dcd8017cbd69eee9eeb5c16c85603fb0a65c070f9b16210"} Oct 08 22:56:47 crc kubenswrapper[4834]: I1008 22:56:47.026672 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:56:47 crc kubenswrapper[4834]: I1008 22:56:47.027754 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:57:17 crc kubenswrapper[4834]: I1008 22:57:17.026004 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:57:17 crc kubenswrapper[4834]: I1008 22:57:17.026796 4834 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:57:47 crc kubenswrapper[4834]: I1008 22:57:47.026500 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 22:57:47 crc kubenswrapper[4834]: I1008 22:57:47.027341 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 22:57:47 crc kubenswrapper[4834]: I1008 22:57:47.027411 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 22:57:47 crc kubenswrapper[4834]: I1008 22:57:47.028447 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab28162cf4c7e6fb4dcd8017cbd69eee9eeb5c16c85603fb0a65c070f9b16210"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 22:57:47 crc kubenswrapper[4834]: I1008 22:57:47.028544 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" 
containerID="cri-o://ab28162cf4c7e6fb4dcd8017cbd69eee9eeb5c16c85603fb0a65c070f9b16210" gracePeriod=600 Oct 08 22:57:48 crc kubenswrapper[4834]: I1008 22:57:48.152323 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="ab28162cf4c7e6fb4dcd8017cbd69eee9eeb5c16c85603fb0a65c070f9b16210" exitCode=0 Oct 08 22:57:48 crc kubenswrapper[4834]: I1008 22:57:48.152371 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"ab28162cf4c7e6fb4dcd8017cbd69eee9eeb5c16c85603fb0a65c070f9b16210"} Oct 08 22:57:48 crc kubenswrapper[4834]: I1008 22:57:48.152616 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"} Oct 08 22:57:48 crc kubenswrapper[4834]: I1008 22:57:48.152641 4834 scope.go:117] "RemoveContainer" containerID="7a87d87ae894a1f065e00350d485d3b4dc6744f2636ae812d07582eee843ca9f" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.182698 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zjzz8"] Oct 08 22:58:07 crc kubenswrapper[4834]: E1008 22:58:07.184189 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="extract-utilities" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.184207 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="extract-utilities" Oct 08 22:58:07 crc kubenswrapper[4834]: E1008 22:58:07.184225 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="extract-content" Oct 08 
22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.184232 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="extract-content" Oct 08 22:58:07 crc kubenswrapper[4834]: E1008 22:58:07.184245 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="registry-server" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.184251 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="registry-server" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.184424 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="92b204b8-6f1e-4965-a656-0b7ffbdf03ec" containerName="registry-server" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.185824 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.198602 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zjzz8"] Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.231206 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr5mg\" (UniqueName: \"kubernetes.io/projected/472445c7-ef02-4e1a-a709-1011d1fbcf25-kube-api-access-gr5mg\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.231578 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-catalog-content\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 
08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.231664 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-utilities\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.333099 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-utilities\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.333168 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr5mg\" (UniqueName: \"kubernetes.io/projected/472445c7-ef02-4e1a-a709-1011d1fbcf25-kube-api-access-gr5mg\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.333209 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-catalog-content\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.333724 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-catalog-content\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc 
kubenswrapper[4834]: I1008 22:58:07.333895 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-utilities\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.356026 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr5mg\" (UniqueName: \"kubernetes.io/projected/472445c7-ef02-4e1a-a709-1011d1fbcf25-kube-api-access-gr5mg\") pod \"certified-operators-zjzz8\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.512884 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:07 crc kubenswrapper[4834]: I1008 22:58:07.976690 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zjzz8"] Oct 08 22:58:08 crc kubenswrapper[4834]: I1008 22:58:08.328232 4834 generic.go:334] "Generic (PLEG): container finished" podID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerID="0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce" exitCode=0 Oct 08 22:58:08 crc kubenswrapper[4834]: I1008 22:58:08.328316 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjzz8" event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerDied","Data":"0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce"} Oct 08 22:58:08 crc kubenswrapper[4834]: I1008 22:58:08.328396 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjzz8" 
event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerStarted","Data":"9a3039c4791cdf4281044f1bc895a0e8046186a62b5e9b32043085ccc2e44759"} Oct 08 22:58:08 crc kubenswrapper[4834]: I1008 22:58:08.330776 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 22:58:09 crc kubenswrapper[4834]: I1008 22:58:09.338565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjzz8" event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerStarted","Data":"63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b"} Oct 08 22:58:10 crc kubenswrapper[4834]: I1008 22:58:10.349436 4834 generic.go:334] "Generic (PLEG): container finished" podID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerID="63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b" exitCode=0 Oct 08 22:58:10 crc kubenswrapper[4834]: I1008 22:58:10.349483 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjzz8" event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerDied","Data":"63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b"} Oct 08 22:58:11 crc kubenswrapper[4834]: I1008 22:58:11.360565 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjzz8" event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerStarted","Data":"24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228"} Oct 08 22:58:11 crc kubenswrapper[4834]: I1008 22:58:11.380235 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zjzz8" podStartSLOduration=1.935394292 podStartE2EDuration="4.380217185s" podCreationTimestamp="2025-10-08 22:58:07 +0000 UTC" firstStartedPulling="2025-10-08 22:58:08.330462355 +0000 UTC m=+2096.153347101" lastFinishedPulling="2025-10-08 22:58:10.775285218 +0000 UTC 
m=+2098.598169994" observedRunningTime="2025-10-08 22:58:11.377264813 +0000 UTC m=+2099.200149609" watchObservedRunningTime="2025-10-08 22:58:11.380217185 +0000 UTC m=+2099.203101931" Oct 08 22:58:17 crc kubenswrapper[4834]: I1008 22:58:17.514817 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:17 crc kubenswrapper[4834]: I1008 22:58:17.515547 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:17 crc kubenswrapper[4834]: I1008 22:58:17.574871 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:18 crc kubenswrapper[4834]: I1008 22:58:18.500624 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:18 crc kubenswrapper[4834]: I1008 22:58:18.556934 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zjzz8"] Oct 08 22:58:20 crc kubenswrapper[4834]: I1008 22:58:20.462458 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zjzz8" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="registry-server" containerID="cri-o://24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228" gracePeriod=2 Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.346834 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.351528 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr5mg\" (UniqueName: \"kubernetes.io/projected/472445c7-ef02-4e1a-a709-1011d1fbcf25-kube-api-access-gr5mg\") pod \"472445c7-ef02-4e1a-a709-1011d1fbcf25\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.351603 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-utilities\") pod \"472445c7-ef02-4e1a-a709-1011d1fbcf25\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.351642 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-catalog-content\") pod \"472445c7-ef02-4e1a-a709-1011d1fbcf25\" (UID: \"472445c7-ef02-4e1a-a709-1011d1fbcf25\") " Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.352854 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-utilities" (OuterVolumeSpecName: "utilities") pod "472445c7-ef02-4e1a-a709-1011d1fbcf25" (UID: "472445c7-ef02-4e1a-a709-1011d1fbcf25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.357454 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472445c7-ef02-4e1a-a709-1011d1fbcf25-kube-api-access-gr5mg" (OuterVolumeSpecName: "kube-api-access-gr5mg") pod "472445c7-ef02-4e1a-a709-1011d1fbcf25" (UID: "472445c7-ef02-4e1a-a709-1011d1fbcf25"). InnerVolumeSpecName "kube-api-access-gr5mg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.411315 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "472445c7-ef02-4e1a-a709-1011d1fbcf25" (UID: "472445c7-ef02-4e1a-a709-1011d1fbcf25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.452657 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr5mg\" (UniqueName: \"kubernetes.io/projected/472445c7-ef02-4e1a-a709-1011d1fbcf25-kube-api-access-gr5mg\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.452685 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.452694 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/472445c7-ef02-4e1a-a709-1011d1fbcf25-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.472586 4834 generic.go:334] "Generic (PLEG): container finished" podID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerID="24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228" exitCode=0 Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.472629 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zjzz8" event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerDied","Data":"24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228"} Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.472672 4834 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zjzz8" event={"ID":"472445c7-ef02-4e1a-a709-1011d1fbcf25","Type":"ContainerDied","Data":"9a3039c4791cdf4281044f1bc895a0e8046186a62b5e9b32043085ccc2e44759"} Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.472686 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zjzz8" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.472688 4834 scope.go:117] "RemoveContainer" containerID="24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.501379 4834 scope.go:117] "RemoveContainer" containerID="63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.520809 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zjzz8"] Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.528236 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zjzz8"] Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.549603 4834 scope.go:117] "RemoveContainer" containerID="0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.564271 4834 scope.go:117] "RemoveContainer" containerID="24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228" Oct 08 22:58:21 crc kubenswrapper[4834]: E1008 22:58:21.564584 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228\": container with ID starting with 24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228 not found: ID does not exist" containerID="24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 
22:58:21.564626 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228"} err="failed to get container status \"24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228\": rpc error: code = NotFound desc = could not find container \"24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228\": container with ID starting with 24e1956ab24ce5e33c4c360600331362181bdff3e88de0aa069bba367a4dd228 not found: ID does not exist" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.564656 4834 scope.go:117] "RemoveContainer" containerID="63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.564729 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" path="/var/lib/kubelet/pods/472445c7-ef02-4e1a-a709-1011d1fbcf25/volumes" Oct 08 22:58:21 crc kubenswrapper[4834]: E1008 22:58:21.565069 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b\": container with ID starting with 63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b not found: ID does not exist" containerID="63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.565101 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b"} err="failed to get container status \"63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b\": rpc error: code = NotFound desc = could not find container \"63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b\": container with ID starting with 
63c581e4f127ef32b2281d2071c263c44f67652723c8ad50d13213d273890d1b not found: ID does not exist" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.565119 4834 scope.go:117] "RemoveContainer" containerID="0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce" Oct 08 22:58:21 crc kubenswrapper[4834]: E1008 22:58:21.565657 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce\": container with ID starting with 0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce not found: ID does not exist" containerID="0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce" Oct 08 22:58:21 crc kubenswrapper[4834]: I1008 22:58:21.565700 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce"} err="failed to get container status \"0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce\": rpc error: code = NotFound desc = could not find container \"0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce\": container with ID starting with 0943baf202a1148c9d6b3bade662568487276ca785c043480d17a459e5c8a1ce not found: ID does not exist" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.189524 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rb6mq"] Oct 08 22:58:44 crc kubenswrapper[4834]: E1008 22:58:44.190394 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="extract-content" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.190410 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="extract-content" Oct 08 22:58:44 crc kubenswrapper[4834]: E1008 22:58:44.190432 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="extract-utilities" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.190442 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="extract-utilities" Oct 08 22:58:44 crc kubenswrapper[4834]: E1008 22:58:44.190464 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="registry-server" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.190472 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="registry-server" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.190641 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="472445c7-ef02-4e1a-a709-1011d1fbcf25" containerName="registry-server" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.191944 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.211494 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rb6mq"] Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.315782 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-catalog-content\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.315862 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-utilities\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.315971 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxndh\" (UniqueName: \"kubernetes.io/projected/94bafe17-a6f5-4f94-aefe-c58f61d405b1-kube-api-access-gxndh\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.417339 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-utilities\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.417432 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gxndh\" (UniqueName: \"kubernetes.io/projected/94bafe17-a6f5-4f94-aefe-c58f61d405b1-kube-api-access-gxndh\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.417595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-catalog-content\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.418039 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-catalog-content\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.418303 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-utilities\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.440315 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxndh\" (UniqueName: \"kubernetes.io/projected/94bafe17-a6f5-4f94-aefe-c58f61d405b1-kube-api-access-gxndh\") pod \"redhat-marketplace-rb6mq\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.523255 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.781588 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2d6w6"] Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.783423 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.798044 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d6w6"] Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.924218 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-catalog-content\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.924269 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6zv\" (UniqueName: \"kubernetes.io/projected/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-kube-api-access-ch6zv\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:44 crc kubenswrapper[4834]: I1008 22:58:44.924578 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-utilities\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.013440 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-rb6mq"] Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.025991 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-catalog-content\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.026060 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6zv\" (UniqueName: \"kubernetes.io/projected/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-kube-api-access-ch6zv\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.026188 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-utilities\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.026688 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-catalog-content\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.026714 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-utilities\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 
crc kubenswrapper[4834]: I1008 22:58:45.045582 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6zv\" (UniqueName: \"kubernetes.io/projected/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-kube-api-access-ch6zv\") pod \"redhat-operators-2d6w6\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.114833 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.564973 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d6w6"] Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.694875 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerStarted","Data":"4f8cac30245ee97c2e6ddeebe2f353137aa0a065c5a3ff8e1c7c1fc29ca18754"} Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.696498 4834 generic.go:334] "Generic (PLEG): container finished" podID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerID="84b1f9520f28694537d82e45fc15f030ccbd539a744b9329954e4f62fac7ce70" exitCode=0 Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.696526 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rb6mq" event={"ID":"94bafe17-a6f5-4f94-aefe-c58f61d405b1","Type":"ContainerDied","Data":"84b1f9520f28694537d82e45fc15f030ccbd539a744b9329954e4f62fac7ce70"} Oct 08 22:58:45 crc kubenswrapper[4834]: I1008 22:58:45.696541 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rb6mq" event={"ID":"94bafe17-a6f5-4f94-aefe-c58f61d405b1","Type":"ContainerStarted","Data":"41a779d53a73a47398792518df705f8aa254c18d71ce128b0dc2c98ee1e1c79d"} Oct 08 22:58:46 crc kubenswrapper[4834]: 
I1008 22:58:46.710409 4834 generic.go:334] "Generic (PLEG): container finished" podID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerID="0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068" exitCode=0 Oct 08 22:58:46 crc kubenswrapper[4834]: I1008 22:58:46.710526 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerDied","Data":"0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068"} Oct 08 22:58:46 crc kubenswrapper[4834]: I1008 22:58:46.723877 4834 generic.go:334] "Generic (PLEG): container finished" podID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerID="a2386e31e4e479805c4c6f7af4e2fb5b214804c887f1fc1dc096c6f4a721995c" exitCode=0 Oct 08 22:58:46 crc kubenswrapper[4834]: I1008 22:58:46.723935 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rb6mq" event={"ID":"94bafe17-a6f5-4f94-aefe-c58f61d405b1","Type":"ContainerDied","Data":"a2386e31e4e479805c4c6f7af4e2fb5b214804c887f1fc1dc096c6f4a721995c"} Oct 08 22:58:47 crc kubenswrapper[4834]: I1008 22:58:47.731925 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerStarted","Data":"8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1"} Oct 08 22:58:47 crc kubenswrapper[4834]: I1008 22:58:47.737967 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rb6mq" event={"ID":"94bafe17-a6f5-4f94-aefe-c58f61d405b1","Type":"ContainerStarted","Data":"584deebc29649c86781cb018afc560d0bb3f08c52b9daea7949aad99bc9994a0"} Oct 08 22:58:47 crc kubenswrapper[4834]: I1008 22:58:47.772973 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rb6mq" podStartSLOduration=2.312953309 
podStartE2EDuration="3.772954317s" podCreationTimestamp="2025-10-08 22:58:44 +0000 UTC" firstStartedPulling="2025-10-08 22:58:45.697649478 +0000 UTC m=+2133.520534224" lastFinishedPulling="2025-10-08 22:58:47.157650466 +0000 UTC m=+2134.980535232" observedRunningTime="2025-10-08 22:58:47.767306499 +0000 UTC m=+2135.590191245" watchObservedRunningTime="2025-10-08 22:58:47.772954317 +0000 UTC m=+2135.595839063" Oct 08 22:58:48 crc kubenswrapper[4834]: I1008 22:58:48.751529 4834 generic.go:334] "Generic (PLEG): container finished" podID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerID="8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1" exitCode=0 Oct 08 22:58:48 crc kubenswrapper[4834]: I1008 22:58:48.753078 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerDied","Data":"8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1"} Oct 08 22:58:49 crc kubenswrapper[4834]: I1008 22:58:49.763264 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerStarted","Data":"9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d"} Oct 08 22:58:49 crc kubenswrapper[4834]: I1008 22:58:49.792006 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2d6w6" podStartSLOduration=3.335479103 podStartE2EDuration="5.791988507s" podCreationTimestamp="2025-10-08 22:58:44 +0000 UTC" firstStartedPulling="2025-10-08 22:58:46.712755763 +0000 UTC m=+2134.535640549" lastFinishedPulling="2025-10-08 22:58:49.169265167 +0000 UTC m=+2136.992149953" observedRunningTime="2025-10-08 22:58:49.790868909 +0000 UTC m=+2137.613753685" watchObservedRunningTime="2025-10-08 22:58:49.791988507 +0000 UTC m=+2137.614873253" Oct 08 22:58:54 crc kubenswrapper[4834]: I1008 
22:58:54.523875 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:54 crc kubenswrapper[4834]: I1008 22:58:54.524582 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:54 crc kubenswrapper[4834]: I1008 22:58:54.592298 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:54 crc kubenswrapper[4834]: I1008 22:58:54.844350 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:54 crc kubenswrapper[4834]: I1008 22:58:54.897751 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rb6mq"] Oct 08 22:58:55 crc kubenswrapper[4834]: I1008 22:58:55.115298 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:55 crc kubenswrapper[4834]: I1008 22:58:55.115391 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:55 crc kubenswrapper[4834]: I1008 22:58:55.190974 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:55 crc kubenswrapper[4834]: I1008 22:58:55.882302 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:56 crc kubenswrapper[4834]: I1008 22:58:56.824299 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rb6mq" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="registry-server" containerID="cri-o://584deebc29649c86781cb018afc560d0bb3f08c52b9daea7949aad99bc9994a0" 
gracePeriod=2 Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.234621 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d6w6"] Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.834308 4834 generic.go:334] "Generic (PLEG): container finished" podID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerID="584deebc29649c86781cb018afc560d0bb3f08c52b9daea7949aad99bc9994a0" exitCode=0 Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.834836 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2d6w6" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="registry-server" containerID="cri-o://9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d" gracePeriod=2 Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.834501 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rb6mq" event={"ID":"94bafe17-a6f5-4f94-aefe-c58f61d405b1","Type":"ContainerDied","Data":"584deebc29649c86781cb018afc560d0bb3f08c52b9daea7949aad99bc9994a0"} Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.834912 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rb6mq" event={"ID":"94bafe17-a6f5-4f94-aefe-c58f61d405b1","Type":"ContainerDied","Data":"41a779d53a73a47398792518df705f8aa254c18d71ce128b0dc2c98ee1e1c79d"} Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.834923 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41a779d53a73a47398792518df705f8aa254c18d71ce128b0dc2c98ee1e1c79d" Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.848128 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.971348 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-catalog-content\") pod \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.971422 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxndh\" (UniqueName: \"kubernetes.io/projected/94bafe17-a6f5-4f94-aefe-c58f61d405b1-kube-api-access-gxndh\") pod \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.971491 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-utilities\") pod \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\" (UID: \"94bafe17-a6f5-4f94-aefe-c58f61d405b1\") " Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.972346 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-utilities" (OuterVolumeSpecName: "utilities") pod "94bafe17-a6f5-4f94-aefe-c58f61d405b1" (UID: "94bafe17-a6f5-4f94-aefe-c58f61d405b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.979580 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94bafe17-a6f5-4f94-aefe-c58f61d405b1-kube-api-access-gxndh" (OuterVolumeSpecName: "kube-api-access-gxndh") pod "94bafe17-a6f5-4f94-aefe-c58f61d405b1" (UID: "94bafe17-a6f5-4f94-aefe-c58f61d405b1"). InnerVolumeSpecName "kube-api-access-gxndh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:58:57 crc kubenswrapper[4834]: I1008 22:58:57.989597 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94bafe17-a6f5-4f94-aefe-c58f61d405b1" (UID: "94bafe17-a6f5-4f94-aefe-c58f61d405b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.073209 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.073244 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxndh\" (UniqueName: \"kubernetes.io/projected/94bafe17-a6f5-4f94-aefe-c58f61d405b1-kube-api-access-gxndh\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.073256 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94bafe17-a6f5-4f94-aefe-c58f61d405b1-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.754832 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.847487 4834 generic.go:334] "Generic (PLEG): container finished" podID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerID="9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d" exitCode=0 Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.847543 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2d6w6" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.847578 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rb6mq" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.847605 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerDied","Data":"9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d"} Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.847637 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d6w6" event={"ID":"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435","Type":"ContainerDied","Data":"4f8cac30245ee97c2e6ddeebe2f353137aa0a065c5a3ff8e1c7c1fc29ca18754"} Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.847655 4834 scope.go:117] "RemoveContainer" containerID="9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.871536 4834 scope.go:117] "RemoveContainer" containerID="8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.883607 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-utilities\") pod \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.883813 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch6zv\" (UniqueName: \"kubernetes.io/projected/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-kube-api-access-ch6zv\") pod \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " Oct 08 22:58:58 crc 
kubenswrapper[4834]: I1008 22:58:58.883844 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-catalog-content\") pod \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\" (UID: \"8fac1f2f-9a2b-4e6b-aaee-1925f97a4435\") " Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.886100 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-utilities" (OuterVolumeSpecName: "utilities") pod "8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" (UID: "8fac1f2f-9a2b-4e6b-aaee-1925f97a4435"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.891005 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rb6mq"] Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.892637 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.896812 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rb6mq"] Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.909634 4834 scope.go:117] "RemoveContainer" containerID="0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.916847 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-kube-api-access-ch6zv" (OuterVolumeSpecName: "kube-api-access-ch6zv") pod "8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" (UID: "8fac1f2f-9a2b-4e6b-aaee-1925f97a4435"). InnerVolumeSpecName "kube-api-access-ch6zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.929256 4834 scope.go:117] "RemoveContainer" containerID="9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d" Oct 08 22:58:58 crc kubenswrapper[4834]: E1008 22:58:58.929708 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d\": container with ID starting with 9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d not found: ID does not exist" containerID="9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.929747 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d"} err="failed to get container status \"9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d\": rpc error: code = NotFound desc = could not find container \"9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d\": container with ID starting with 9412899f6287e7f3fc8295af6a66ca8590c0d19fba69012247c3a0d3ac194e0d not found: ID does not exist" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.929771 4834 scope.go:117] "RemoveContainer" containerID="8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1" Oct 08 22:58:58 crc kubenswrapper[4834]: E1008 22:58:58.930184 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1\": container with ID starting with 8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1 not found: ID does not exist" containerID="8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.930221 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1"} err="failed to get container status \"8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1\": rpc error: code = NotFound desc = could not find container \"8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1\": container with ID starting with 8a28dfe2e94b2dd6c6586c0c7f470795acf419d1699f596c40bc40db32aebdf1 not found: ID does not exist" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.930241 4834 scope.go:117] "RemoveContainer" containerID="0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068" Oct 08 22:58:58 crc kubenswrapper[4834]: E1008 22:58:58.930663 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068\": container with ID starting with 0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068 not found: ID does not exist" containerID="0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.930774 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068"} err="failed to get container status \"0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068\": rpc error: code = NotFound desc = could not find container \"0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068\": container with ID starting with 0651ed880b52315c504c522da92c7d6784d7b1e62485af2e4fc02a676bffe068 not found: ID does not exist" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.974320 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" (UID: "8fac1f2f-9a2b-4e6b-aaee-1925f97a4435"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.994009 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch6zv\" (UniqueName: \"kubernetes.io/projected/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-kube-api-access-ch6zv\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:58 crc kubenswrapper[4834]: I1008 22:58:58.994055 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 22:58:59 crc kubenswrapper[4834]: I1008 22:58:59.196986 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d6w6"] Oct 08 22:58:59 crc kubenswrapper[4834]: I1008 22:58:59.204853 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2d6w6"] Oct 08 22:58:59 crc kubenswrapper[4834]: I1008 22:58:59.572698 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" path="/var/lib/kubelet/pods/8fac1f2f-9a2b-4e6b-aaee-1925f97a4435/volumes" Oct 08 22:58:59 crc kubenswrapper[4834]: I1008 22:58:59.574098 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" path="/var/lib/kubelet/pods/94bafe17-a6f5-4f94-aefe-c58f61d405b1/volumes" Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.109414 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6qclp"] Oct 08 22:59:38 crc kubenswrapper[4834]: E1008 22:59:38.110363 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="extract-utilities" Oct 
08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110380 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="extract-utilities"
Oct 08 22:59:38 crc kubenswrapper[4834]: E1008 22:59:38.110403 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="registry-server"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110410 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="registry-server"
Oct 08 22:59:38 crc kubenswrapper[4834]: E1008 22:59:38.110433 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="extract-content"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110442 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="extract-content"
Oct 08 22:59:38 crc kubenswrapper[4834]: E1008 22:59:38.110458 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="extract-content"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110465 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="extract-content"
Oct 08 22:59:38 crc kubenswrapper[4834]: E1008 22:59:38.110475 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="registry-server"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110481 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="registry-server"
Oct 08 22:59:38 crc kubenswrapper[4834]: E1008 22:59:38.110497 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="extract-utilities"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110504 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="extract-utilities"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110933 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fac1f2f-9a2b-4e6b-aaee-1925f97a4435" containerName="registry-server"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.110956 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="94bafe17-a6f5-4f94-aefe-c58f61d405b1" containerName="registry-server"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.112888 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.126668 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qclp"]
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.223611 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zt5s\" (UniqueName: \"kubernetes.io/projected/4828f2cf-8234-4289-aeae-65eb34fd3eee-kube-api-access-4zt5s\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.223688 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-utilities\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.223753 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-catalog-content\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.325264 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-utilities\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.325387 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-catalog-content\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.325469 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zt5s\" (UniqueName: \"kubernetes.io/projected/4828f2cf-8234-4289-aeae-65eb34fd3eee-kube-api-access-4zt5s\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.326002 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-utilities\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.326011 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-catalog-content\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.350071 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zt5s\" (UniqueName: \"kubernetes.io/projected/4828f2cf-8234-4289-aeae-65eb34fd3eee-kube-api-access-4zt5s\") pod \"community-operators-6qclp\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") " pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.462091 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:38 crc kubenswrapper[4834]: I1008 22:59:38.938117 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qclp"]
Oct 08 22:59:39 crc kubenswrapper[4834]: I1008 22:59:39.240802 4834 generic.go:334] "Generic (PLEG): container finished" podID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerID="ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d" exitCode=0
Oct 08 22:59:39 crc kubenswrapper[4834]: I1008 22:59:39.240849 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qclp" event={"ID":"4828f2cf-8234-4289-aeae-65eb34fd3eee","Type":"ContainerDied","Data":"ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d"}
Oct 08 22:59:39 crc kubenswrapper[4834]: I1008 22:59:39.240880 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qclp" event={"ID":"4828f2cf-8234-4289-aeae-65eb34fd3eee","Type":"ContainerStarted","Data":"e267b474c466301dff89266b99d0a6abf0429416fbc5c9cec52dde802312fced"}
Oct 08 22:59:41 crc kubenswrapper[4834]: I1008 22:59:41.258185 4834 generic.go:334] "Generic (PLEG): container finished" podID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerID="bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8" exitCode=0
Oct 08 22:59:41 crc kubenswrapper[4834]: I1008 22:59:41.258374 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qclp" event={"ID":"4828f2cf-8234-4289-aeae-65eb34fd3eee","Type":"ContainerDied","Data":"bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8"}
Oct 08 22:59:43 crc kubenswrapper[4834]: I1008 22:59:43.274217 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qclp" event={"ID":"4828f2cf-8234-4289-aeae-65eb34fd3eee","Type":"ContainerStarted","Data":"8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5"}
Oct 08 22:59:43 crc kubenswrapper[4834]: I1008 22:59:43.292961 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6qclp" podStartSLOduration=1.8455253470000001 podStartE2EDuration="5.292935592s" podCreationTimestamp="2025-10-08 22:59:38 +0000 UTC" firstStartedPulling="2025-10-08 22:59:39.243133055 +0000 UTC m=+2187.066017801" lastFinishedPulling="2025-10-08 22:59:42.69054326 +0000 UTC m=+2190.513428046" observedRunningTime="2025-10-08 22:59:43.289633022 +0000 UTC m=+2191.112517768" watchObservedRunningTime="2025-10-08 22:59:43.292935592 +0000 UTC m=+2191.115820358"
Oct 08 22:59:47 crc kubenswrapper[4834]: I1008 22:59:47.025694 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 22:59:47 crc kubenswrapper[4834]: I1008 22:59:47.026222 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 22:59:48 crc kubenswrapper[4834]: I1008 22:59:48.463547 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:48 crc kubenswrapper[4834]: I1008 22:59:48.463890 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:48 crc kubenswrapper[4834]: I1008 22:59:48.505389 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:49 crc kubenswrapper[4834]: I1008 22:59:49.405619 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:49 crc kubenswrapper[4834]: I1008 22:59:49.468323 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qclp"]
Oct 08 22:59:51 crc kubenswrapper[4834]: I1008 22:59:51.345277 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6qclp" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="registry-server" containerID="cri-o://8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5" gracePeriod=2
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.315655 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.363095 4834 generic.go:334] "Generic (PLEG): container finished" podID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerID="8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5" exitCode=0
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.363157 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qclp" event={"ID":"4828f2cf-8234-4289-aeae-65eb34fd3eee","Type":"ContainerDied","Data":"8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5"}
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.363185 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qclp" event={"ID":"4828f2cf-8234-4289-aeae-65eb34fd3eee","Type":"ContainerDied","Data":"e267b474c466301dff89266b99d0a6abf0429416fbc5c9cec52dde802312fced"}
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.363202 4834 scope.go:117] "RemoveContainer" containerID="8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.363318 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qclp"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.386775 4834 scope.go:117] "RemoveContainer" containerID="bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.408673 4834 scope.go:117] "RemoveContainer" containerID="ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.431949 4834 scope.go:117] "RemoveContainer" containerID="8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5"
Oct 08 22:59:52 crc kubenswrapper[4834]: E1008 22:59:52.432598 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5\": container with ID starting with 8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5 not found: ID does not exist" containerID="8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.432635 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5"} err="failed to get container status \"8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5\": rpc error: code = NotFound desc = could not find container \"8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5\": container with ID starting with 8a8b9f2b1ffdb83f3de3276e7ad592ed226e5c5c4122fa53e4070922ec2c12b5 not found: ID does not exist"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.432670 4834 scope.go:117] "RemoveContainer" containerID="bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8"
Oct 08 22:59:52 crc kubenswrapper[4834]: E1008 22:59:52.433032 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8\": container with ID starting with bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8 not found: ID does not exist" containerID="bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.433082 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8"} err="failed to get container status \"bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8\": rpc error: code = NotFound desc = could not find container \"bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8\": container with ID starting with bd217252d265b11191cb5e1899dc398a3e8391f430d9de5d37c7d16e8a1d8ae8 not found: ID does not exist"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.433117 4834 scope.go:117] "RemoveContainer" containerID="ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d"
Oct 08 22:59:52 crc kubenswrapper[4834]: E1008 22:59:52.434275 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d\": container with ID starting with ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d not found: ID does not exist" containerID="ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.434350 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d"} err="failed to get container status \"ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d\": rpc error: code = NotFound desc = could not find container \"ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d\": container with ID starting with ec3bb39a08f4a11ce079ff6c3c4f1a4bf72456f42aa1b15f62655f21284bde9d not found: ID does not exist"
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.437535 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-catalog-content\") pod \"4828f2cf-8234-4289-aeae-65eb34fd3eee\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") "
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.437623 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zt5s\" (UniqueName: \"kubernetes.io/projected/4828f2cf-8234-4289-aeae-65eb34fd3eee-kube-api-access-4zt5s\") pod \"4828f2cf-8234-4289-aeae-65eb34fd3eee\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") "
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.438017 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-utilities\") pod \"4828f2cf-8234-4289-aeae-65eb34fd3eee\" (UID: \"4828f2cf-8234-4289-aeae-65eb34fd3eee\") "
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.441380 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-utilities" (OuterVolumeSpecName: "utilities") pod "4828f2cf-8234-4289-aeae-65eb34fd3eee" (UID: "4828f2cf-8234-4289-aeae-65eb34fd3eee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.445473 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4828f2cf-8234-4289-aeae-65eb34fd3eee-kube-api-access-4zt5s" (OuterVolumeSpecName: "kube-api-access-4zt5s") pod "4828f2cf-8234-4289-aeae-65eb34fd3eee" (UID: "4828f2cf-8234-4289-aeae-65eb34fd3eee"). InnerVolumeSpecName "kube-api-access-4zt5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.498488 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4828f2cf-8234-4289-aeae-65eb34fd3eee" (UID: "4828f2cf-8234-4289-aeae-65eb34fd3eee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.540670 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.540703 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zt5s\" (UniqueName: \"kubernetes.io/projected/4828f2cf-8234-4289-aeae-65eb34fd3eee-kube-api-access-4zt5s\") on node \"crc\" DevicePath \"\""
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.540739 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4828f2cf-8234-4289-aeae-65eb34fd3eee-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.696191 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qclp"]
Oct 08 22:59:52 crc kubenswrapper[4834]: I1008 22:59:52.703711 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6qclp"]
Oct 08 22:59:53 crc kubenswrapper[4834]: I1008 22:59:53.571029 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" path="/var/lib/kubelet/pods/4828f2cf-8234-4289-aeae-65eb34fd3eee/volumes"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.166381 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"]
Oct 08 23:00:00 crc kubenswrapper[4834]: E1008 23:00:00.167303 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="extract-content"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.167329 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="extract-content"
Oct 08 23:00:00 crc kubenswrapper[4834]: E1008 23:00:00.167357 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="extract-utilities"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.167371 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="extract-utilities"
Oct 08 23:00:00 crc kubenswrapper[4834]: E1008 23:00:00.167413 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="registry-server"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.167427 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="registry-server"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.167735 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="4828f2cf-8234-4289-aeae-65eb34fd3eee" containerName="registry-server"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.168552 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.171059 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846rq\" (UniqueName: \"kubernetes.io/projected/6ca2c493-23f8-42c9-be1e-968ebe02de13-kube-api-access-846rq\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.171381 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca2c493-23f8-42c9-be1e-968ebe02de13-config-volume\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.171539 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca2c493-23f8-42c9-be1e-968ebe02de13-secret-volume\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.179213 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"]
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.185017 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.185471 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.272570 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-846rq\" (UniqueName: \"kubernetes.io/projected/6ca2c493-23f8-42c9-be1e-968ebe02de13-kube-api-access-846rq\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.272868 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca2c493-23f8-42c9-be1e-968ebe02de13-config-volume\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.272899 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca2c493-23f8-42c9-be1e-968ebe02de13-secret-volume\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.273877 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca2c493-23f8-42c9-be1e-968ebe02de13-config-volume\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.278813 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca2c493-23f8-42c9-be1e-968ebe02de13-secret-volume\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.288619 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846rq\" (UniqueName: \"kubernetes.io/projected/6ca2c493-23f8-42c9-be1e-968ebe02de13-kube-api-access-846rq\") pod \"collect-profiles-29332740-7f6kp\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.512916 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:00 crc kubenswrapper[4834]: I1008 23:00:00.927826 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"]
Oct 08 23:00:01 crc kubenswrapper[4834]: I1008 23:00:01.443800 4834 generic.go:334] "Generic (PLEG): container finished" podID="6ca2c493-23f8-42c9-be1e-968ebe02de13" containerID="44439a7551b735867dceb42bf54e6fe469833e1462650e3425630a21dbd2b055" exitCode=0
Oct 08 23:00:01 crc kubenswrapper[4834]: I1008 23:00:01.443842 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp" event={"ID":"6ca2c493-23f8-42c9-be1e-968ebe02de13","Type":"ContainerDied","Data":"44439a7551b735867dceb42bf54e6fe469833e1462650e3425630a21dbd2b055"}
Oct 08 23:00:01 crc kubenswrapper[4834]: I1008 23:00:01.443866 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp" event={"ID":"6ca2c493-23f8-42c9-be1e-968ebe02de13","Type":"ContainerStarted","Data":"6ea8f934e13e45792602003ac43066eaa069f239fdb9ce33e8c4b0e836f8decf"}
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.743987 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.906443 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca2c493-23f8-42c9-be1e-968ebe02de13-secret-volume\") pod \"6ca2c493-23f8-42c9-be1e-968ebe02de13\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") "
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.906575 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca2c493-23f8-42c9-be1e-968ebe02de13-config-volume\") pod \"6ca2c493-23f8-42c9-be1e-968ebe02de13\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") "
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.906734 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-846rq\" (UniqueName: \"kubernetes.io/projected/6ca2c493-23f8-42c9-be1e-968ebe02de13-kube-api-access-846rq\") pod \"6ca2c493-23f8-42c9-be1e-968ebe02de13\" (UID: \"6ca2c493-23f8-42c9-be1e-968ebe02de13\") "
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.909713 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca2c493-23f8-42c9-be1e-968ebe02de13-config-volume" (OuterVolumeSpecName: "config-volume") pod "6ca2c493-23f8-42c9-be1e-968ebe02de13" (UID: "6ca2c493-23f8-42c9-be1e-968ebe02de13"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.913836 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca2c493-23f8-42c9-be1e-968ebe02de13-kube-api-access-846rq" (OuterVolumeSpecName: "kube-api-access-846rq") pod "6ca2c493-23f8-42c9-be1e-968ebe02de13" (UID: "6ca2c493-23f8-42c9-be1e-968ebe02de13"). InnerVolumeSpecName "kube-api-access-846rq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 23:00:02 crc kubenswrapper[4834]: I1008 23:00:02.914340 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ca2c493-23f8-42c9-be1e-968ebe02de13-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6ca2c493-23f8-42c9-be1e-968ebe02de13" (UID: "6ca2c493-23f8-42c9-be1e-968ebe02de13"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.008191 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-846rq\" (UniqueName: \"kubernetes.io/projected/6ca2c493-23f8-42c9-be1e-968ebe02de13-kube-api-access-846rq\") on node \"crc\" DevicePath \"\""
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.008227 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6ca2c493-23f8-42c9-be1e-968ebe02de13-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.008243 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ca2c493-23f8-42c9-be1e-968ebe02de13-config-volume\") on node \"crc\" DevicePath \"\""
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.463681 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp" event={"ID":"6ca2c493-23f8-42c9-be1e-968ebe02de13","Type":"ContainerDied","Data":"6ea8f934e13e45792602003ac43066eaa069f239fdb9ce33e8c4b0e836f8decf"}
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.463741 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea8f934e13e45792602003ac43066eaa069f239fdb9ce33e8c4b0e836f8decf"
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.463799 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.851667 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb"]
Oct 08 23:00:03 crc kubenswrapper[4834]: I1008 23:00:03.858719 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332695-bh7rb"]
Oct 08 23:00:05 crc kubenswrapper[4834]: I1008 23:00:05.572451 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c302d50-f83a-448d-a914-905ec04ada98" path="/var/lib/kubelet/pods/3c302d50-f83a-448d-a914-905ec04ada98/volumes"
Oct 08 23:00:17 crc kubenswrapper[4834]: I1008 23:00:17.025141 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 23:00:17 crc kubenswrapper[4834]: I1008 23:00:17.027078 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 23:00:21 crc kubenswrapper[4834]: I1008 23:00:21.878378 4834 scope.go:117] "RemoveContainer" containerID="d60f771dc7d8aa571922d2bdcb0cccf4509c88a3299a79da67ff81630f1770ca"
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.025668 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.030057 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.030640 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z"
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.032075 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.032449 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" gracePeriod=600
Oct 08 23:00:47 crc kubenswrapper[4834]: E1008 23:00:47.166375 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.876358 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" exitCode=0
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.876416 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"}
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.876492 4834 scope.go:117] "RemoveContainer" containerID="ab28162cf4c7e6fb4dcd8017cbd69eee9eeb5c16c85603fb0a65c070f9b16210"
Oct 08 23:00:47 crc kubenswrapper[4834]: I1008 23:00:47.878998 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"
Oct 08 23:00:47 crc kubenswrapper[4834]: E1008 23:00:47.879630 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:01:02 crc kubenswrapper[4834]: I1008 23:01:02.556578 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"
Oct 08 23:01:02 crc kubenswrapper[4834]: E1008 23:01:02.559644 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:01:16 crc kubenswrapper[4834]: I1008 23:01:16.556614 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"
Oct 08 23:01:16 crc kubenswrapper[4834]: E1008 23:01:16.557342 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:01:29 crc kubenswrapper[4834]: I1008 23:01:29.555101 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c"
Oct 08 23:01:29 crc kubenswrapper[4834]: E1008 23:01:29.555869 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:01:42 crc kubenswrapper[4834]: I1008 23:01:42.555722 4834 scope.go:117] 
"RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:01:42 crc kubenswrapper[4834]: E1008 23:01:42.556635 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:01:55 crc kubenswrapper[4834]: I1008 23:01:55.561930 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:01:55 crc kubenswrapper[4834]: E1008 23:01:55.562769 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:02:10 crc kubenswrapper[4834]: I1008 23:02:10.555925 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:02:10 crc kubenswrapper[4834]: E1008 23:02:10.556569 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:02:23 crc kubenswrapper[4834]: I1008 23:02:23.563750 
4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:02:23 crc kubenswrapper[4834]: E1008 23:02:23.565046 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:02:37 crc kubenswrapper[4834]: I1008 23:02:37.555789 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:02:37 crc kubenswrapper[4834]: E1008 23:02:37.556729 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:02:52 crc kubenswrapper[4834]: I1008 23:02:52.556993 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:02:52 crc kubenswrapper[4834]: E1008 23:02:52.557803 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:03:07 crc kubenswrapper[4834]: I1008 
23:03:07.556557 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:03:07 crc kubenswrapper[4834]: E1008 23:03:07.557458 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:03:21 crc kubenswrapper[4834]: I1008 23:03:21.555700 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:03:21 crc kubenswrapper[4834]: E1008 23:03:21.556774 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:03:36 crc kubenswrapper[4834]: I1008 23:03:36.555948 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:03:36 crc kubenswrapper[4834]: E1008 23:03:36.556662 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:03:49 crc 
kubenswrapper[4834]: I1008 23:03:49.555589 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:03:49 crc kubenswrapper[4834]: E1008 23:03:49.556611 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:04:02 crc kubenswrapper[4834]: I1008 23:04:02.555188 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:04:02 crc kubenswrapper[4834]: E1008 23:04:02.555923 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:04:14 crc kubenswrapper[4834]: I1008 23:04:14.556884 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:04:14 crc kubenswrapper[4834]: E1008 23:04:14.557656 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 
08 23:04:29 crc kubenswrapper[4834]: I1008 23:04:29.556369 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:04:29 crc kubenswrapper[4834]: E1008 23:04:29.557528 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:04:41 crc kubenswrapper[4834]: I1008 23:04:41.555164 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:04:41 crc kubenswrapper[4834]: E1008 23:04:41.555784 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:04:54 crc kubenswrapper[4834]: I1008 23:04:54.556486 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:04:54 crc kubenswrapper[4834]: E1008 23:04:54.557577 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" 
podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:05:09 crc kubenswrapper[4834]: I1008 23:05:09.555651 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:05:09 crc kubenswrapper[4834]: E1008 23:05:09.556543 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:05:21 crc kubenswrapper[4834]: I1008 23:05:21.555321 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:05:21 crc kubenswrapper[4834]: E1008 23:05:21.556307 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:05:22 crc kubenswrapper[4834]: I1008 23:05:22.049363 4834 scope.go:117] "RemoveContainer" containerID="584deebc29649c86781cb018afc560d0bb3f08c52b9daea7949aad99bc9994a0" Oct 08 23:05:22 crc kubenswrapper[4834]: I1008 23:05:22.079551 4834 scope.go:117] "RemoveContainer" containerID="a2386e31e4e479805c4c6f7af4e2fb5b214804c887f1fc1dc096c6f4a721995c" Oct 08 23:05:22 crc kubenswrapper[4834]: I1008 23:05:22.104022 4834 scope.go:117] "RemoveContainer" containerID="84b1f9520f28694537d82e45fc15f030ccbd539a744b9329954e4f62fac7ce70" Oct 08 23:05:33 crc kubenswrapper[4834]: I1008 23:05:33.566984 4834 
scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:05:33 crc kubenswrapper[4834]: E1008 23:05:33.568864 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:05:44 crc kubenswrapper[4834]: I1008 23:05:44.556349 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:05:44 crc kubenswrapper[4834]: E1008 23:05:44.557206 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:05:55 crc kubenswrapper[4834]: I1008 23:05:55.555510 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:05:56 crc kubenswrapper[4834]: I1008 23:05:56.697213 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"67683b6ce7035acf55b3080a06c7f20aa9af2ca455ec79f64b2a695a576915ad"} Oct 08 23:08:17 crc kubenswrapper[4834]: I1008 23:08:17.025888 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:08:17 crc kubenswrapper[4834]: I1008 23:08:17.026559 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.148594 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwwhv"] Oct 08 23:08:31 crc kubenswrapper[4834]: E1008 23:08:31.149921 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca2c493-23f8-42c9-be1e-968ebe02de13" containerName="collect-profiles" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.149940 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca2c493-23f8-42c9-be1e-968ebe02de13" containerName="collect-profiles" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.150248 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca2c493-23f8-42c9-be1e-968ebe02de13" containerName="collect-profiles" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.157320 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.178613 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwwhv"] Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.273464 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-utilities\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.273739 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-catalog-content\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.273869 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq2kx\" (UniqueName: \"kubernetes.io/projected/c8a04422-ab6d-4e6b-88d5-0692ca08b439-kube-api-access-kq2kx\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.375221 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-utilities\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.375349 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-catalog-content\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.375402 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2kx\" (UniqueName: \"kubernetes.io/projected/c8a04422-ab6d-4e6b-88d5-0692ca08b439-kube-api-access-kq2kx\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.376086 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-catalog-content\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.376102 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-utilities\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.401296 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2kx\" (UniqueName: \"kubernetes.io/projected/c8a04422-ab6d-4e6b-88d5-0692ca08b439-kube-api-access-kq2kx\") pod \"certified-operators-mwwhv\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.484750 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:31 crc kubenswrapper[4834]: I1008 23:08:31.985369 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwwhv"] Oct 08 23:08:32 crc kubenswrapper[4834]: I1008 23:08:32.081966 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerStarted","Data":"e77db29e38255c55f65981e0d786a75dbadc5cbc83ba2b8ccbaf848944c4b07f"} Oct 08 23:08:33 crc kubenswrapper[4834]: I1008 23:08:33.091663 4834 generic.go:334] "Generic (PLEG): container finished" podID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerID="358a557201197b7b6948dcada844ae845b4c76eb819abb106bfa65b36860efce" exitCode=0 Oct 08 23:08:33 crc kubenswrapper[4834]: I1008 23:08:33.091710 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerDied","Data":"358a557201197b7b6948dcada844ae845b4c76eb819abb106bfa65b36860efce"} Oct 08 23:08:33 crc kubenswrapper[4834]: I1008 23:08:33.094190 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 23:08:34 crc kubenswrapper[4834]: I1008 23:08:34.104667 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerStarted","Data":"8d3cf2ecd025e6ba2c2be650c3fb82ea8c017d6731a1a5b7ed261766fc28c5dc"} Oct 08 23:08:35 crc kubenswrapper[4834]: I1008 23:08:35.113307 4834 generic.go:334] "Generic (PLEG): container finished" podID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerID="8d3cf2ecd025e6ba2c2be650c3fb82ea8c017d6731a1a5b7ed261766fc28c5dc" exitCode=0 Oct 08 23:08:35 crc kubenswrapper[4834]: I1008 23:08:35.113388 4834 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerDied","Data":"8d3cf2ecd025e6ba2c2be650c3fb82ea8c017d6731a1a5b7ed261766fc28c5dc"} Oct 08 23:08:36 crc kubenswrapper[4834]: I1008 23:08:36.123381 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerStarted","Data":"a04c60d48e1c9e440685548a3b2604b149ac16ff78ecd0e513c195c00879d14f"} Oct 08 23:08:36 crc kubenswrapper[4834]: I1008 23:08:36.144583 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwwhv" podStartSLOduration=2.696332004 podStartE2EDuration="5.144566736s" podCreationTimestamp="2025-10-08 23:08:31 +0000 UTC" firstStartedPulling="2025-10-08 23:08:33.09385724 +0000 UTC m=+2720.916741986" lastFinishedPulling="2025-10-08 23:08:35.542091962 +0000 UTC m=+2723.364976718" observedRunningTime="2025-10-08 23:08:36.138536639 +0000 UTC m=+2723.961421395" watchObservedRunningTime="2025-10-08 23:08:36.144566736 +0000 UTC m=+2723.967451482" Oct 08 23:08:41 crc kubenswrapper[4834]: I1008 23:08:41.484997 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:41 crc kubenswrapper[4834]: I1008 23:08:41.485407 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:41 crc kubenswrapper[4834]: I1008 23:08:41.573828 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:42 crc kubenswrapper[4834]: I1008 23:08:42.254699 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:42 crc kubenswrapper[4834]: I1008 23:08:42.320255 
4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwwhv"] Oct 08 23:08:44 crc kubenswrapper[4834]: I1008 23:08:44.194227 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwwhv" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="registry-server" containerID="cri-o://a04c60d48e1c9e440685548a3b2604b149ac16ff78ecd0e513c195c00879d14f" gracePeriod=2 Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.205678 4834 generic.go:334] "Generic (PLEG): container finished" podID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerID="a04c60d48e1c9e440685548a3b2604b149ac16ff78ecd0e513c195c00879d14f" exitCode=0 Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.205939 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerDied","Data":"a04c60d48e1c9e440685548a3b2604b149ac16ff78ecd0e513c195c00879d14f"} Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.256068 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.288980 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-utilities\") pod \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.289069 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq2kx\" (UniqueName: \"kubernetes.io/projected/c8a04422-ab6d-4e6b-88d5-0692ca08b439-kube-api-access-kq2kx\") pod \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.289111 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-catalog-content\") pod \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\" (UID: \"c8a04422-ab6d-4e6b-88d5-0692ca08b439\") " Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.291099 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-utilities" (OuterVolumeSpecName: "utilities") pod "c8a04422-ab6d-4e6b-88d5-0692ca08b439" (UID: "c8a04422-ab6d-4e6b-88d5-0692ca08b439"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.301330 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a04422-ab6d-4e6b-88d5-0692ca08b439-kube-api-access-kq2kx" (OuterVolumeSpecName: "kube-api-access-kq2kx") pod "c8a04422-ab6d-4e6b-88d5-0692ca08b439" (UID: "c8a04422-ab6d-4e6b-88d5-0692ca08b439"). InnerVolumeSpecName "kube-api-access-kq2kx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.333480 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8a04422-ab6d-4e6b-88d5-0692ca08b439" (UID: "c8a04422-ab6d-4e6b-88d5-0692ca08b439"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.391883 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.391940 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq2kx\" (UniqueName: \"kubernetes.io/projected/c8a04422-ab6d-4e6b-88d5-0692ca08b439-kube-api-access-kq2kx\") on node \"crc\" DevicePath \"\"" Oct 08 23:08:45 crc kubenswrapper[4834]: I1008 23:08:45.391953 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8a04422-ab6d-4e6b-88d5-0692ca08b439-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.221466 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwwhv" event={"ID":"c8a04422-ab6d-4e6b-88d5-0692ca08b439","Type":"ContainerDied","Data":"e77db29e38255c55f65981e0d786a75dbadc5cbc83ba2b8ccbaf848944c4b07f"} Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.221864 4834 scope.go:117] "RemoveContainer" containerID="a04c60d48e1c9e440685548a3b2604b149ac16ff78ecd0e513c195c00879d14f" Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.221594 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwwhv" Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.261863 4834 scope.go:117] "RemoveContainer" containerID="8d3cf2ecd025e6ba2c2be650c3fb82ea8c017d6731a1a5b7ed261766fc28c5dc" Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.262071 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwwhv"] Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.279008 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwwhv"] Oct 08 23:08:46 crc kubenswrapper[4834]: I1008 23:08:46.336991 4834 scope.go:117] "RemoveContainer" containerID="358a557201197b7b6948dcada844ae845b4c76eb819abb106bfa65b36860efce" Oct 08 23:08:47 crc kubenswrapper[4834]: I1008 23:08:47.026456 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:08:47 crc kubenswrapper[4834]: I1008 23:08:47.026675 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:08:47 crc kubenswrapper[4834]: I1008 23:08:47.575405 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" path="/var/lib/kubelet/pods/c8a04422-ab6d-4e6b-88d5-0692ca08b439/volumes" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.247738 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2cnlf"] Oct 08 23:08:51 crc kubenswrapper[4834]: E1008 
23:08:51.249673 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="registry-server" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.250065 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="registry-server" Oct 08 23:08:51 crc kubenswrapper[4834]: E1008 23:08:51.250198 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="extract-utilities" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.250262 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="extract-utilities" Oct 08 23:08:51 crc kubenswrapper[4834]: E1008 23:08:51.250323 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="extract-content" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.250715 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="extract-content" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.250988 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a04422-ab6d-4e6b-88d5-0692ca08b439" containerName="registry-server" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.252412 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.271294 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2cnlf"] Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.285937 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-catalog-content\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.286120 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnkhm\" (UniqueName: \"kubernetes.io/projected/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-kube-api-access-bnkhm\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.286265 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-utilities\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.387985 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-utilities\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.388386 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-catalog-content\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.388600 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnkhm\" (UniqueName: \"kubernetes.io/projected/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-kube-api-access-bnkhm\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.389103 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-utilities\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.389223 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-catalog-content\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.424783 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnkhm\" (UniqueName: \"kubernetes.io/projected/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-kube-api-access-bnkhm\") pod \"redhat-operators-2cnlf\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.596051 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:08:51 crc kubenswrapper[4834]: I1008 23:08:51.855814 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2cnlf"] Oct 08 23:08:52 crc kubenswrapper[4834]: I1008 23:08:52.283543 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerID="073982e8e480e51dfd757726700631b78c2e54e0c33ef2a32553c4c34b8f488e" exitCode=0 Oct 08 23:08:52 crc kubenswrapper[4834]: I1008 23:08:52.283616 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerDied","Data":"073982e8e480e51dfd757726700631b78c2e54e0c33ef2a32553c4c34b8f488e"} Oct 08 23:08:52 crc kubenswrapper[4834]: I1008 23:08:52.283661 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerStarted","Data":"ec5764855876e9e038a7052c86af08fd584e910d388795aee28f0bdccbf21cba"} Oct 08 23:08:53 crc kubenswrapper[4834]: I1008 23:08:53.293231 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerStarted","Data":"b18b836bef39840aa46dd98ea161b4dfae443aeec564d1e0e8e7068fcbdb8172"} Oct 08 23:08:54 crc kubenswrapper[4834]: I1008 23:08:54.306811 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerID="b18b836bef39840aa46dd98ea161b4dfae443aeec564d1e0e8e7068fcbdb8172" exitCode=0 Oct 08 23:08:54 crc kubenswrapper[4834]: I1008 23:08:54.306881 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" 
event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerDied","Data":"b18b836bef39840aa46dd98ea161b4dfae443aeec564d1e0e8e7068fcbdb8172"} Oct 08 23:08:55 crc kubenswrapper[4834]: I1008 23:08:55.322114 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerStarted","Data":"eca222706819092268a6b0f7524e1b705512a7d4300348f44a214df405ed2a0a"} Oct 08 23:08:55 crc kubenswrapper[4834]: I1008 23:08:55.344839 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2cnlf" podStartSLOduration=1.899149756 podStartE2EDuration="4.344803115s" podCreationTimestamp="2025-10-08 23:08:51 +0000 UTC" firstStartedPulling="2025-10-08 23:08:52.285619112 +0000 UTC m=+2740.108503868" lastFinishedPulling="2025-10-08 23:08:54.731272471 +0000 UTC m=+2742.554157227" observedRunningTime="2025-10-08 23:08:55.343678257 +0000 UTC m=+2743.166563013" watchObservedRunningTime="2025-10-08 23:08:55.344803115 +0000 UTC m=+2743.167687881" Oct 08 23:09:01 crc kubenswrapper[4834]: I1008 23:09:01.596195 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:09:01 crc kubenswrapper[4834]: I1008 23:09:01.596540 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:09:01 crc kubenswrapper[4834]: I1008 23:09:01.674658 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:09:02 crc kubenswrapper[4834]: I1008 23:09:02.438818 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:09:02 crc kubenswrapper[4834]: I1008 23:09:02.509604 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-2cnlf"] Oct 08 23:09:04 crc kubenswrapper[4834]: I1008 23:09:04.404328 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2cnlf" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="registry-server" containerID="cri-o://eca222706819092268a6b0f7524e1b705512a7d4300348f44a214df405ed2a0a" gracePeriod=2 Oct 08 23:09:05 crc kubenswrapper[4834]: I1008 23:09:05.418317 4834 generic.go:334] "Generic (PLEG): container finished" podID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerID="eca222706819092268a6b0f7524e1b705512a7d4300348f44a214df405ed2a0a" exitCode=0 Oct 08 23:09:05 crc kubenswrapper[4834]: I1008 23:09:05.418392 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerDied","Data":"eca222706819092268a6b0f7524e1b705512a7d4300348f44a214df405ed2a0a"} Oct 08 23:09:05 crc kubenswrapper[4834]: I1008 23:09:05.973383 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.018831 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-catalog-content\") pod \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.018896 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnkhm\" (UniqueName: \"kubernetes.io/projected/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-kube-api-access-bnkhm\") pod \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.018948 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-utilities\") pod \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\" (UID: \"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245\") " Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.020504 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-utilities" (OuterVolumeSpecName: "utilities") pod "0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" (UID: "0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.027889 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-kube-api-access-bnkhm" (OuterVolumeSpecName: "kube-api-access-bnkhm") pod "0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" (UID: "0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245"). InnerVolumeSpecName "kube-api-access-bnkhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.114183 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" (UID: "0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.121408 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.121432 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnkhm\" (UniqueName: \"kubernetes.io/projected/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-kube-api-access-bnkhm\") on node \"crc\" DevicePath \"\"" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.121443 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.430627 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2cnlf" event={"ID":"0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245","Type":"ContainerDied","Data":"ec5764855876e9e038a7052c86af08fd584e910d388795aee28f0bdccbf21cba"} Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.431254 4834 scope.go:117] "RemoveContainer" containerID="eca222706819092268a6b0f7524e1b705512a7d4300348f44a214df405ed2a0a" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.430740 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2cnlf" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.451834 4834 scope.go:117] "RemoveContainer" containerID="b18b836bef39840aa46dd98ea161b4dfae443aeec564d1e0e8e7068fcbdb8172" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.479100 4834 scope.go:117] "RemoveContainer" containerID="073982e8e480e51dfd757726700631b78c2e54e0c33ef2a32553c4c34b8f488e" Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.481748 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2cnlf"] Oct 08 23:09:06 crc kubenswrapper[4834]: I1008 23:09:06.487312 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2cnlf"] Oct 08 23:09:07 crc kubenswrapper[4834]: I1008 23:09:07.571974 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" path="/var/lib/kubelet/pods/0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245/volumes" Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.026449 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.027456 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.027546 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:09:17 crc 
kubenswrapper[4834]: I1008 23:09:17.028655 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"67683b6ce7035acf55b3080a06c7f20aa9af2ca455ec79f64b2a695a576915ad"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.028800 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://67683b6ce7035acf55b3080a06c7f20aa9af2ca455ec79f64b2a695a576915ad" gracePeriod=600 Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.557889 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="67683b6ce7035acf55b3080a06c7f20aa9af2ca455ec79f64b2a695a576915ad" exitCode=0 Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.565949 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"67683b6ce7035acf55b3080a06c7f20aa9af2ca455ec79f64b2a695a576915ad"} Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.566000 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370"} Oct 08 23:09:17 crc kubenswrapper[4834]: I1008 23:09:17.566017 4834 scope.go:117] "RemoveContainer" containerID="a18ee326e2172a9ae38e108604c69aa83a219df0ffd11778b8d39611a3b0536c" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.410410 4834 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghw4h"] Oct 08 23:09:18 crc kubenswrapper[4834]: E1008 23:09:18.410952 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="extract-utilities" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.411363 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="extract-utilities" Oct 08 23:09:18 crc kubenswrapper[4834]: E1008 23:09:18.411400 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="extract-content" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.411413 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="extract-content" Oct 08 23:09:18 crc kubenswrapper[4834]: E1008 23:09:18.411436 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="registry-server" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.411448 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="registry-server" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.411825 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec4bb7e-9fa2-4fb8-9deb-427a5cab9245" containerName="registry-server" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.414131 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.431032 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghw4h"] Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.515747 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-utilities\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.515832 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-catalog-content\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.515936 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2vc\" (UniqueName: \"kubernetes.io/projected/7a4fe2b1-de89-4b44-979d-d53896598c75-kube-api-access-dw2vc\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.618519 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-utilities\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.618655 4834 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-catalog-content\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.618695 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw2vc\" (UniqueName: \"kubernetes.io/projected/7a4fe2b1-de89-4b44-979d-d53896598c75-kube-api-access-dw2vc\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.619996 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-utilities\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.620266 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-catalog-content\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.653646 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw2vc\" (UniqueName: \"kubernetes.io/projected/7a4fe2b1-de89-4b44-979d-d53896598c75-kube-api-access-dw2vc\") pod \"redhat-marketplace-ghw4h\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:18 crc kubenswrapper[4834]: I1008 23:09:18.747509 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:19 crc kubenswrapper[4834]: I1008 23:09:19.199968 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghw4h"] Oct 08 23:09:19 crc kubenswrapper[4834]: I1008 23:09:19.584336 4834 generic.go:334] "Generic (PLEG): container finished" podID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerID="89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5" exitCode=0 Oct 08 23:09:19 crc kubenswrapper[4834]: I1008 23:09:19.584405 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghw4h" event={"ID":"7a4fe2b1-de89-4b44-979d-d53896598c75","Type":"ContainerDied","Data":"89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5"} Oct 08 23:09:19 crc kubenswrapper[4834]: I1008 23:09:19.584718 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghw4h" event={"ID":"7a4fe2b1-de89-4b44-979d-d53896598c75","Type":"ContainerStarted","Data":"b210cb873b424315ecfb1f32039f3c0140b828c56acad86e6c4d087ae3a39b4b"} Oct 08 23:09:21 crc kubenswrapper[4834]: I1008 23:09:21.609647 4834 generic.go:334] "Generic (PLEG): container finished" podID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerID="e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a" exitCode=0 Oct 08 23:09:21 crc kubenswrapper[4834]: I1008 23:09:21.609799 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghw4h" event={"ID":"7a4fe2b1-de89-4b44-979d-d53896598c75","Type":"ContainerDied","Data":"e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a"} Oct 08 23:09:22 crc kubenswrapper[4834]: I1008 23:09:22.622479 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghw4h" 
event={"ID":"7a4fe2b1-de89-4b44-979d-d53896598c75","Type":"ContainerStarted","Data":"965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd"} Oct 08 23:09:22 crc kubenswrapper[4834]: I1008 23:09:22.648727 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghw4h" podStartSLOduration=2.053439782 podStartE2EDuration="4.648702284s" podCreationTimestamp="2025-10-08 23:09:18 +0000 UTC" firstStartedPulling="2025-10-08 23:09:19.586499767 +0000 UTC m=+2767.409384513" lastFinishedPulling="2025-10-08 23:09:22.181762239 +0000 UTC m=+2770.004647015" observedRunningTime="2025-10-08 23:09:22.640999947 +0000 UTC m=+2770.463884703" watchObservedRunningTime="2025-10-08 23:09:22.648702284 +0000 UTC m=+2770.471587070" Oct 08 23:09:28 crc kubenswrapper[4834]: I1008 23:09:28.748686 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:28 crc kubenswrapper[4834]: I1008 23:09:28.749351 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:28 crc kubenswrapper[4834]: I1008 23:09:28.826711 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:29 crc kubenswrapper[4834]: I1008 23:09:29.763052 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:29 crc kubenswrapper[4834]: I1008 23:09:29.822869 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghw4h"] Oct 08 23:09:31 crc kubenswrapper[4834]: I1008 23:09:31.719669 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ghw4h" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="registry-server" 
containerID="cri-o://965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd" gracePeriod=2 Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.241635 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.337640 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-catalog-content\") pod \"7a4fe2b1-de89-4b44-979d-d53896598c75\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.337838 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-utilities\") pod \"7a4fe2b1-de89-4b44-979d-d53896598c75\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.337863 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw2vc\" (UniqueName: \"kubernetes.io/projected/7a4fe2b1-de89-4b44-979d-d53896598c75-kube-api-access-dw2vc\") pod \"7a4fe2b1-de89-4b44-979d-d53896598c75\" (UID: \"7a4fe2b1-de89-4b44-979d-d53896598c75\") " Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.339175 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-utilities" (OuterVolumeSpecName: "utilities") pod "7a4fe2b1-de89-4b44-979d-d53896598c75" (UID: "7a4fe2b1-de89-4b44-979d-d53896598c75"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.344865 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a4fe2b1-de89-4b44-979d-d53896598c75-kube-api-access-dw2vc" (OuterVolumeSpecName: "kube-api-access-dw2vc") pod "7a4fe2b1-de89-4b44-979d-d53896598c75" (UID: "7a4fe2b1-de89-4b44-979d-d53896598c75"). InnerVolumeSpecName "kube-api-access-dw2vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.362920 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a4fe2b1-de89-4b44-979d-d53896598c75" (UID: "7a4fe2b1-de89-4b44-979d-d53896598c75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.439818 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw2vc\" (UniqueName: \"kubernetes.io/projected/7a4fe2b1-de89-4b44-979d-d53896598c75-kube-api-access-dw2vc\") on node \"crc\" DevicePath \"\"" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.439872 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.439893 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a4fe2b1-de89-4b44-979d-d53896598c75-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.733105 4834 generic.go:334] "Generic (PLEG): container finished" podID="7a4fe2b1-de89-4b44-979d-d53896598c75" 
containerID="965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd" exitCode=0 Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.733190 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghw4h" event={"ID":"7a4fe2b1-de89-4b44-979d-d53896598c75","Type":"ContainerDied","Data":"965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd"} Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.733230 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghw4h" event={"ID":"7a4fe2b1-de89-4b44-979d-d53896598c75","Type":"ContainerDied","Data":"b210cb873b424315ecfb1f32039f3c0140b828c56acad86e6c4d087ae3a39b4b"} Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.733260 4834 scope.go:117] "RemoveContainer" containerID="965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.733428 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghw4h" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.782767 4834 scope.go:117] "RemoveContainer" containerID="e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.783707 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghw4h"] Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.790281 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghw4h"] Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.827596 4834 scope.go:117] "RemoveContainer" containerID="89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.846446 4834 scope.go:117] "RemoveContainer" containerID="965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd" Oct 08 23:09:32 crc kubenswrapper[4834]: E1008 23:09:32.850979 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd\": container with ID starting with 965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd not found: ID does not exist" containerID="965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.851032 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd"} err="failed to get container status \"965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd\": rpc error: code = NotFound desc = could not find container \"965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd\": container with ID starting with 965d253427393ced64e20131b217b336d62e0372ba6022b05fceb63fcc05e9fd not found: 
ID does not exist" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.851065 4834 scope.go:117] "RemoveContainer" containerID="e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a" Oct 08 23:09:32 crc kubenswrapper[4834]: E1008 23:09:32.851540 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a\": container with ID starting with e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a not found: ID does not exist" containerID="e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.851580 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a"} err="failed to get container status \"e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a\": rpc error: code = NotFound desc = could not find container \"e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a\": container with ID starting with e161132f101440508e7651c91c1819acab163300ec1192f49f2af55bc340482a not found: ID does not exist" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.851607 4834 scope.go:117] "RemoveContainer" containerID="89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5" Oct 08 23:09:32 crc kubenswrapper[4834]: E1008 23:09:32.851983 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5\": container with ID starting with 89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5 not found: ID does not exist" containerID="89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5" Oct 08 23:09:32 crc kubenswrapper[4834]: I1008 23:09:32.852002 4834 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5"} err="failed to get container status \"89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5\": rpc error: code = NotFound desc = could not find container \"89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5\": container with ID starting with 89031b5e510bb1914d334e282d5eaf97a99ea3509a11263bc688b8a607a587e5 not found: ID does not exist" Oct 08 23:09:33 crc kubenswrapper[4834]: I1008 23:09:33.567346 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" path="/var/lib/kubelet/pods/7a4fe2b1-de89-4b44-979d-d53896598c75/volumes" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.941464 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jbv95"] Oct 08 23:10:56 crc kubenswrapper[4834]: E1008 23:10:56.942396 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="extract-content" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.942418 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="extract-content" Oct 08 23:10:56 crc kubenswrapper[4834]: E1008 23:10:56.942455 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="registry-server" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.942468 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="registry-server" Oct 08 23:10:56 crc kubenswrapper[4834]: E1008 23:10:56.942502 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="extract-utilities" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.942516 4834 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="extract-utilities" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.942774 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4fe2b1-de89-4b44-979d-d53896598c75" containerName="registry-server" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.944914 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:56 crc kubenswrapper[4834]: I1008 23:10:56.962393 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbv95"] Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.102129 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-utilities\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.102650 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-catalog-content\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.102821 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg8nm\" (UniqueName: \"kubernetes.io/projected/c1433ac8-0e4f-47b1-b679-91b2dc119637-kube-api-access-fg8nm\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 
23:10:57.204784 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-catalog-content\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.205194 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg8nm\" (UniqueName: \"kubernetes.io/projected/c1433ac8-0e4f-47b1-b679-91b2dc119637-kube-api-access-fg8nm\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.205438 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-utilities\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.205509 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-catalog-content\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.206337 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-utilities\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.241983 4834 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg8nm\" (UniqueName: \"kubernetes.io/projected/c1433ac8-0e4f-47b1-b679-91b2dc119637-kube-api-access-fg8nm\") pod \"community-operators-jbv95\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.288735 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:10:57 crc kubenswrapper[4834]: I1008 23:10:57.819484 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jbv95"] Oct 08 23:10:57 crc kubenswrapper[4834]: W1008 23:10:57.821453 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1433ac8_0e4f_47b1_b679_91b2dc119637.slice/crio-9e80957c4c9db76cd0c4b35fb15ca2e90038c2f3c69535d0263090daa328cc15 WatchSource:0}: Error finding container 9e80957c4c9db76cd0c4b35fb15ca2e90038c2f3c69535d0263090daa328cc15: Status 404 returned error can't find the container with id 9e80957c4c9db76cd0c4b35fb15ca2e90038c2f3c69535d0263090daa328cc15 Oct 08 23:10:58 crc kubenswrapper[4834]: I1008 23:10:58.553749 4834 generic.go:334] "Generic (PLEG): container finished" podID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerID="34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d" exitCode=0 Oct 08 23:10:58 crc kubenswrapper[4834]: I1008 23:10:58.553812 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerDied","Data":"34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d"} Oct 08 23:10:58 crc kubenswrapper[4834]: I1008 23:10:58.553852 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" 
event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerStarted","Data":"9e80957c4c9db76cd0c4b35fb15ca2e90038c2f3c69535d0263090daa328cc15"} Oct 08 23:10:59 crc kubenswrapper[4834]: I1008 23:10:59.573010 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerStarted","Data":"4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181"} Oct 08 23:11:00 crc kubenswrapper[4834]: I1008 23:11:00.581782 4834 generic.go:334] "Generic (PLEG): container finished" podID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerID="4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181" exitCode=0 Oct 08 23:11:00 crc kubenswrapper[4834]: I1008 23:11:00.581849 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerDied","Data":"4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181"} Oct 08 23:11:01 crc kubenswrapper[4834]: I1008 23:11:01.597546 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerStarted","Data":"852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e"} Oct 08 23:11:07 crc kubenswrapper[4834]: I1008 23:11:07.289405 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:11:07 crc kubenswrapper[4834]: I1008 23:11:07.290051 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:11:07 crc kubenswrapper[4834]: I1008 23:11:07.370451 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:11:07 crc kubenswrapper[4834]: I1008 
23:11:07.400912 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jbv95" podStartSLOduration=8.794245547 podStartE2EDuration="11.400883204s" podCreationTimestamp="2025-10-08 23:10:56 +0000 UTC" firstStartedPulling="2025-10-08 23:10:58.555930231 +0000 UTC m=+2866.378815017" lastFinishedPulling="2025-10-08 23:11:01.162567898 +0000 UTC m=+2868.985452674" observedRunningTime="2025-10-08 23:11:01.63381577 +0000 UTC m=+2869.456700556" watchObservedRunningTime="2025-10-08 23:11:07.400883204 +0000 UTC m=+2875.223767990" Oct 08 23:11:07 crc kubenswrapper[4834]: I1008 23:11:07.730285 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:11:07 crc kubenswrapper[4834]: I1008 23:11:07.802404 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbv95"] Oct 08 23:11:09 crc kubenswrapper[4834]: I1008 23:11:09.675971 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jbv95" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="registry-server" containerID="cri-o://852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e" gracePeriod=2 Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.236863 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.327534 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-catalog-content\") pod \"c1433ac8-0e4f-47b1-b679-91b2dc119637\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.327709 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-utilities\") pod \"c1433ac8-0e4f-47b1-b679-91b2dc119637\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.327761 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg8nm\" (UniqueName: \"kubernetes.io/projected/c1433ac8-0e4f-47b1-b679-91b2dc119637-kube-api-access-fg8nm\") pod \"c1433ac8-0e4f-47b1-b679-91b2dc119637\" (UID: \"c1433ac8-0e4f-47b1-b679-91b2dc119637\") " Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.329016 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-utilities" (OuterVolumeSpecName: "utilities") pod "c1433ac8-0e4f-47b1-b679-91b2dc119637" (UID: "c1433ac8-0e4f-47b1-b679-91b2dc119637"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.337881 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1433ac8-0e4f-47b1-b679-91b2dc119637-kube-api-access-fg8nm" (OuterVolumeSpecName: "kube-api-access-fg8nm") pod "c1433ac8-0e4f-47b1-b679-91b2dc119637" (UID: "c1433ac8-0e4f-47b1-b679-91b2dc119637"). InnerVolumeSpecName "kube-api-access-fg8nm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.407058 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1433ac8-0e4f-47b1-b679-91b2dc119637" (UID: "c1433ac8-0e4f-47b1-b679-91b2dc119637"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.430279 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.430332 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg8nm\" (UniqueName: \"kubernetes.io/projected/c1433ac8-0e4f-47b1-b679-91b2dc119637-kube-api-access-fg8nm\") on node \"crc\" DevicePath \"\"" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.430355 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1433ac8-0e4f-47b1-b679-91b2dc119637-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.689522 4834 generic.go:334] "Generic (PLEG): container finished" podID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerID="852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e" exitCode=0 Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.689604 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerDied","Data":"852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e"} Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.689622 4834 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-jbv95" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.689698 4834 scope.go:117] "RemoveContainer" containerID="852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.689679 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jbv95" event={"ID":"c1433ac8-0e4f-47b1-b679-91b2dc119637","Type":"ContainerDied","Data":"9e80957c4c9db76cd0c4b35fb15ca2e90038c2f3c69535d0263090daa328cc15"} Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.721920 4834 scope.go:117] "RemoveContainer" containerID="4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.747123 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jbv95"] Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.756470 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jbv95"] Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.771644 4834 scope.go:117] "RemoveContainer" containerID="34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.807126 4834 scope.go:117] "RemoveContainer" containerID="852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e" Oct 08 23:11:10 crc kubenswrapper[4834]: E1008 23:11:10.807847 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e\": container with ID starting with 852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e not found: ID does not exist" containerID="852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.807951 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e"} err="failed to get container status \"852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e\": rpc error: code = NotFound desc = could not find container \"852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e\": container with ID starting with 852e0c406493122ce5ef1d00c8a8c927c6fdc74d280a95fa795d12dc4fcea41e not found: ID does not exist" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.808007 4834 scope.go:117] "RemoveContainer" containerID="4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181" Oct 08 23:11:10 crc kubenswrapper[4834]: E1008 23:11:10.808729 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181\": container with ID starting with 4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181 not found: ID does not exist" containerID="4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.808787 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181"} err="failed to get container status \"4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181\": rpc error: code = NotFound desc = could not find container \"4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181\": container with ID starting with 4e61d18a4e0e905e6b9f5b3f91b8f87044b1c00ad321c26894a8dfbf415ea181 not found: ID does not exist" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.808826 4834 scope.go:117] "RemoveContainer" containerID="34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d" Oct 08 23:11:10 crc kubenswrapper[4834]: E1008 
23:11:10.809289 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d\": container with ID starting with 34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d not found: ID does not exist" containerID="34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d" Oct 08 23:11:10 crc kubenswrapper[4834]: I1008 23:11:10.809353 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d"} err="failed to get container status \"34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d\": rpc error: code = NotFound desc = could not find container \"34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d\": container with ID starting with 34a4cf1e38b13dd4ed7823330ab016f713b8c181f8c5f49a80a6e911fb69eb8d not found: ID does not exist" Oct 08 23:11:11 crc kubenswrapper[4834]: I1008 23:11:11.574848 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" path="/var/lib/kubelet/pods/c1433ac8-0e4f-47b1-b679-91b2dc119637/volumes" Oct 08 23:11:17 crc kubenswrapper[4834]: I1008 23:11:17.025560 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:11:17 crc kubenswrapper[4834]: I1008 23:11:17.026223 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 08 23:11:47 crc kubenswrapper[4834]: I1008 23:11:47.026008 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:11:47 crc kubenswrapper[4834]: I1008 23:11:47.026924 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.026891 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.027719 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.027809 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.028972 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370"} 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.029104 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" gracePeriod=600 Oct 08 23:12:17 crc kubenswrapper[4834]: E1008 23:12:17.187811 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.341260 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" exitCode=0 Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.341312 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370"} Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.341345 4834 scope.go:117] "RemoveContainer" containerID="67683b6ce7035acf55b3080a06c7f20aa9af2ca455ec79f64b2a695a576915ad" Oct 08 23:12:17 crc kubenswrapper[4834]: I1008 23:12:17.342130 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 
08 23:12:17 crc kubenswrapper[4834]: E1008 23:12:17.342611 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:12:30 crc kubenswrapper[4834]: I1008 23:12:30.556472 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:12:30 crc kubenswrapper[4834]: E1008 23:12:30.557403 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:12:43 crc kubenswrapper[4834]: I1008 23:12:43.565081 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:12:43 crc kubenswrapper[4834]: E1008 23:12:43.566343 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:12:56 crc kubenswrapper[4834]: I1008 23:12:56.556085 4834 scope.go:117] "RemoveContainer" 
containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:12:56 crc kubenswrapper[4834]: E1008 23:12:56.557238 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:13:08 crc kubenswrapper[4834]: I1008 23:13:08.556893 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:13:08 crc kubenswrapper[4834]: E1008 23:13:08.558104 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:13:19 crc kubenswrapper[4834]: I1008 23:13:19.557582 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:13:19 crc kubenswrapper[4834]: E1008 23:13:19.558941 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:13:30 crc kubenswrapper[4834]: I1008 23:13:30.555279 4834 scope.go:117] 
"RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:13:30 crc kubenswrapper[4834]: E1008 23:13:30.557410 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:13:42 crc kubenswrapper[4834]: I1008 23:13:42.555885 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:13:42 crc kubenswrapper[4834]: E1008 23:13:42.556703 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:13:55 crc kubenswrapper[4834]: I1008 23:13:55.555353 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:13:55 crc kubenswrapper[4834]: E1008 23:13:55.558353 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:14:08 crc kubenswrapper[4834]: I1008 23:14:08.556221 
4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:14:08 crc kubenswrapper[4834]: E1008 23:14:08.557262 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:14:19 crc kubenswrapper[4834]: I1008 23:14:19.555851 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:14:19 crc kubenswrapper[4834]: E1008 23:14:19.556780 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:14:30 crc kubenswrapper[4834]: I1008 23:14:30.556063 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:14:30 crc kubenswrapper[4834]: E1008 23:14:30.556942 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:14:42 crc kubenswrapper[4834]: I1008 
23:14:42.555458 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:14:42 crc kubenswrapper[4834]: E1008 23:14:42.556017 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:14:53 crc kubenswrapper[4834]: I1008 23:14:53.563173 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:14:53 crc kubenswrapper[4834]: E1008 23:14:53.564252 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.206552 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf"] Oct 08 23:15:00 crc kubenswrapper[4834]: E1008 23:15:00.208723 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="extract-utilities" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.208859 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="extract-utilities" Oct 08 23:15:00 crc kubenswrapper[4834]: E1008 23:15:00.208960 4834 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="registry-server" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.209041 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="registry-server" Oct 08 23:15:00 crc kubenswrapper[4834]: E1008 23:15:00.209119 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="extract-content" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.209229 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="extract-content" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.209487 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1433ac8-0e4f-47b1-b679-91b2dc119637" containerName="registry-server" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.210121 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.212530 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.212763 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.231095 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf"] Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.335558 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73be29c1-9a86-4800-9c58-c1dcc6de5475-config-volume\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.335701 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62pzq\" (UniqueName: \"kubernetes.io/projected/73be29c1-9a86-4800-9c58-c1dcc6de5475-kube-api-access-62pzq\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.335807 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73be29c1-9a86-4800-9c58-c1dcc6de5475-secret-volume\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.436826 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73be29c1-9a86-4800-9c58-c1dcc6de5475-secret-volume\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.436950 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73be29c1-9a86-4800-9c58-c1dcc6de5475-config-volume\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.437005 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62pzq\" (UniqueName: \"kubernetes.io/projected/73be29c1-9a86-4800-9c58-c1dcc6de5475-kube-api-access-62pzq\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.438536 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73be29c1-9a86-4800-9c58-c1dcc6de5475-config-volume\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.449417 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/73be29c1-9a86-4800-9c58-c1dcc6de5475-secret-volume\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.460299 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62pzq\" (UniqueName: \"kubernetes.io/projected/73be29c1-9a86-4800-9c58-c1dcc6de5475-kube-api-access-62pzq\") pod \"collect-profiles-29332755-bvxmf\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.531460 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.828128 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf"] Oct 08 23:15:00 crc kubenswrapper[4834]: I1008 23:15:00.945960 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" event={"ID":"73be29c1-9a86-4800-9c58-c1dcc6de5475","Type":"ContainerStarted","Data":"03cdeeb2e8ea2f697a5c4903bd1276c0a6e203c0def29b5a1e3e4f71a28e5686"} Oct 08 23:15:01 crc kubenswrapper[4834]: I1008 23:15:01.961477 4834 generic.go:334] "Generic (PLEG): container finished" podID="73be29c1-9a86-4800-9c58-c1dcc6de5475" containerID="eab09225d7449747241ca48fee7da55ec5743bb24cacb55716d23e9a74e3dac2" exitCode=0 Oct 08 23:15:01 crc kubenswrapper[4834]: I1008 23:15:01.961547 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" 
event={"ID":"73be29c1-9a86-4800-9c58-c1dcc6de5475","Type":"ContainerDied","Data":"eab09225d7449747241ca48fee7da55ec5743bb24cacb55716d23e9a74e3dac2"} Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.317330 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.481956 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62pzq\" (UniqueName: \"kubernetes.io/projected/73be29c1-9a86-4800-9c58-c1dcc6de5475-kube-api-access-62pzq\") pod \"73be29c1-9a86-4800-9c58-c1dcc6de5475\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.482168 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73be29c1-9a86-4800-9c58-c1dcc6de5475-secret-volume\") pod \"73be29c1-9a86-4800-9c58-c1dcc6de5475\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.482209 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73be29c1-9a86-4800-9c58-c1dcc6de5475-config-volume\") pod \"73be29c1-9a86-4800-9c58-c1dcc6de5475\" (UID: \"73be29c1-9a86-4800-9c58-c1dcc6de5475\") " Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.483131 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73be29c1-9a86-4800-9c58-c1dcc6de5475-config-volume" (OuterVolumeSpecName: "config-volume") pod "73be29c1-9a86-4800-9c58-c1dcc6de5475" (UID: "73be29c1-9a86-4800-9c58-c1dcc6de5475"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.487092 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73be29c1-9a86-4800-9c58-c1dcc6de5475-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73be29c1-9a86-4800-9c58-c1dcc6de5475" (UID: "73be29c1-9a86-4800-9c58-c1dcc6de5475"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.495708 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73be29c1-9a86-4800-9c58-c1dcc6de5475-kube-api-access-62pzq" (OuterVolumeSpecName: "kube-api-access-62pzq") pod "73be29c1-9a86-4800-9c58-c1dcc6de5475" (UID: "73be29c1-9a86-4800-9c58-c1dcc6de5475"). InnerVolumeSpecName "kube-api-access-62pzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.584030 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73be29c1-9a86-4800-9c58-c1dcc6de5475-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.584716 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73be29c1-9a86-4800-9c58-c1dcc6de5475-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.585034 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62pzq\" (UniqueName: \"kubernetes.io/projected/73be29c1-9a86-4800-9c58-c1dcc6de5475-kube-api-access-62pzq\") on node \"crc\" DevicePath \"\"" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.979864 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" 
event={"ID":"73be29c1-9a86-4800-9c58-c1dcc6de5475","Type":"ContainerDied","Data":"03cdeeb2e8ea2f697a5c4903bd1276c0a6e203c0def29b5a1e3e4f71a28e5686"} Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.979905 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03cdeeb2e8ea2f697a5c4903bd1276c0a6e203c0def29b5a1e3e4f71a28e5686" Oct 08 23:15:03 crc kubenswrapper[4834]: I1008 23:15:03.979945 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332755-bvxmf" Oct 08 23:15:04 crc kubenswrapper[4834]: I1008 23:15:04.425373 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn"] Oct 08 23:15:04 crc kubenswrapper[4834]: I1008 23:15:04.435120 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332710-sj7zn"] Oct 08 23:15:05 crc kubenswrapper[4834]: I1008 23:15:05.574261 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b80f2a3c-d45a-4888-9ac1-39f5ee2abea1" path="/var/lib/kubelet/pods/b80f2a3c-d45a-4888-9ac1-39f5ee2abea1/volumes" Oct 08 23:15:08 crc kubenswrapper[4834]: I1008 23:15:08.556534 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:15:08 crc kubenswrapper[4834]: E1008 23:15:08.557315 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:15:22 crc kubenswrapper[4834]: I1008 23:15:22.427826 4834 scope.go:117] "RemoveContainer" 
containerID="b7e3ceae00cd0572166ae276fba98515dfa3144d20968b111ea7d37a1228fb05" Oct 08 23:15:23 crc kubenswrapper[4834]: I1008 23:15:23.562354 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:15:23 crc kubenswrapper[4834]: E1008 23:15:23.563029 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:15:38 crc kubenswrapper[4834]: I1008 23:15:38.556322 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:15:38 crc kubenswrapper[4834]: E1008 23:15:38.557497 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:15:51 crc kubenswrapper[4834]: I1008 23:15:51.556380 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:15:51 crc kubenswrapper[4834]: E1008 23:15:51.557473 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:16:06 crc kubenswrapper[4834]: I1008 23:16:06.556516 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:16:06 crc kubenswrapper[4834]: E1008 23:16:06.557674 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:16:21 crc kubenswrapper[4834]: I1008 23:16:21.556453 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:16:21 crc kubenswrapper[4834]: E1008 23:16:21.557485 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:16:34 crc kubenswrapper[4834]: I1008 23:16:34.555097 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:16:34 crc kubenswrapper[4834]: E1008 23:16:34.555596 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:16:48 crc kubenswrapper[4834]: I1008 23:16:48.555423 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:16:48 crc kubenswrapper[4834]: E1008 23:16:48.556077 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:17:00 crc kubenswrapper[4834]: I1008 23:17:00.556225 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:17:00 crc kubenswrapper[4834]: E1008 23:17:00.557484 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:17:14 crc kubenswrapper[4834]: I1008 23:17:14.556514 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:17:14 crc kubenswrapper[4834]: E1008 23:17:14.557492 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:17:27 crc kubenswrapper[4834]: I1008 23:17:27.555348 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:17:28 crc kubenswrapper[4834]: I1008 23:17:28.367537 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"ec8bbd92a8a38a0a3ba2206e4c96f4d9510575445372448f780335597e1912a0"} Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.674654 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jxrh5"] Oct 08 23:19:01 crc kubenswrapper[4834]: E1008 23:19:01.676056 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73be29c1-9a86-4800-9c58-c1dcc6de5475" containerName="collect-profiles" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.676088 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="73be29c1-9a86-4800-9c58-c1dcc6de5475" containerName="collect-profiles" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.676484 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="73be29c1-9a86-4800-9c58-c1dcc6de5475" containerName="collect-profiles" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.678427 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.685139 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxrh5"] Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.781672 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-utilities\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.781748 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.781797 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6bd\" (UniqueName: \"kubernetes.io/projected/ed3c1981-6703-4107-b53c-92e6438155db-kube-api-access-2t6bd\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.882952 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6bd\" (UniqueName: \"kubernetes.io/projected/ed3c1981-6703-4107-b53c-92e6438155db-kube-api-access-2t6bd\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.883030 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-utilities\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.883096 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.883887 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.884391 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-utilities\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:01 crc kubenswrapper[4834]: I1008 23:19:01.909999 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6bd\" (UniqueName: \"kubernetes.io/projected/ed3c1981-6703-4107-b53c-92e6438155db-kube-api-access-2t6bd\") pod \"redhat-operators-jxrh5\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:02 crc kubenswrapper[4834]: I1008 23:19:02.013201 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:02 crc kubenswrapper[4834]: I1008 23:19:02.257631 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxrh5"] Oct 08 23:19:02 crc kubenswrapper[4834]: I1008 23:19:02.288323 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxrh5" event={"ID":"ed3c1981-6703-4107-b53c-92e6438155db","Type":"ContainerStarted","Data":"4a3a95dfffcea4ed6c8c35870c75071cbaed89279588b074e1f93b9dcddf888c"} Oct 08 23:19:03 crc kubenswrapper[4834]: I1008 23:19:03.303006 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed3c1981-6703-4107-b53c-92e6438155db" containerID="ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1" exitCode=0 Oct 08 23:19:03 crc kubenswrapper[4834]: I1008 23:19:03.303117 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxrh5" event={"ID":"ed3c1981-6703-4107-b53c-92e6438155db","Type":"ContainerDied","Data":"ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1"} Oct 08 23:19:03 crc kubenswrapper[4834]: I1008 23:19:03.307926 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 23:19:05 crc kubenswrapper[4834]: I1008 23:19:05.326978 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed3c1981-6703-4107-b53c-92e6438155db" containerID="fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31" exitCode=0 Oct 08 23:19:05 crc kubenswrapper[4834]: I1008 23:19:05.327067 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxrh5" event={"ID":"ed3c1981-6703-4107-b53c-92e6438155db","Type":"ContainerDied","Data":"fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31"} Oct 08 23:19:06 crc kubenswrapper[4834]: I1008 23:19:06.340713 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jxrh5" event={"ID":"ed3c1981-6703-4107-b53c-92e6438155db","Type":"ContainerStarted","Data":"774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d"} Oct 08 23:19:06 crc kubenswrapper[4834]: I1008 23:19:06.378343 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jxrh5" podStartSLOduration=2.729425229 podStartE2EDuration="5.378312831s" podCreationTimestamp="2025-10-08 23:19:01 +0000 UTC" firstStartedPulling="2025-10-08 23:19:03.306990575 +0000 UTC m=+3351.129875361" lastFinishedPulling="2025-10-08 23:19:05.955878177 +0000 UTC m=+3353.778762963" observedRunningTime="2025-10-08 23:19:06.367844775 +0000 UTC m=+3354.190729551" watchObservedRunningTime="2025-10-08 23:19:06.378312831 +0000 UTC m=+3354.201197647" Oct 08 23:19:12 crc kubenswrapper[4834]: I1008 23:19:12.013559 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:12 crc kubenswrapper[4834]: I1008 23:19:12.015610 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:13 crc kubenswrapper[4834]: I1008 23:19:13.090911 4834 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jxrh5" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="registry-server" probeResult="failure" output=< Oct 08 23:19:13 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Oct 08 23:19:13 crc kubenswrapper[4834]: > Oct 08 23:19:22 crc kubenswrapper[4834]: I1008 23:19:22.089097 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:22 crc kubenswrapper[4834]: I1008 23:19:22.156776 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jxrh5" 
Oct 08 23:19:22 crc kubenswrapper[4834]: I1008 23:19:22.336923 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxrh5"] Oct 08 23:19:23 crc kubenswrapper[4834]: I1008 23:19:23.526403 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jxrh5" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="registry-server" containerID="cri-o://774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d" gracePeriod=2 Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.098422 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.254618 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-utilities\") pod \"ed3c1981-6703-4107-b53c-92e6438155db\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.254691 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content\") pod \"ed3c1981-6703-4107-b53c-92e6438155db\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.254742 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t6bd\" (UniqueName: \"kubernetes.io/projected/ed3c1981-6703-4107-b53c-92e6438155db-kube-api-access-2t6bd\") pod \"ed3c1981-6703-4107-b53c-92e6438155db\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.256136 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-utilities" (OuterVolumeSpecName: "utilities") pod "ed3c1981-6703-4107-b53c-92e6438155db" (UID: "ed3c1981-6703-4107-b53c-92e6438155db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.261679 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3c1981-6703-4107-b53c-92e6438155db-kube-api-access-2t6bd" (OuterVolumeSpecName: "kube-api-access-2t6bd") pod "ed3c1981-6703-4107-b53c-92e6438155db" (UID: "ed3c1981-6703-4107-b53c-92e6438155db"). InnerVolumeSpecName "kube-api-access-2t6bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.355451 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed3c1981-6703-4107-b53c-92e6438155db" (UID: "ed3c1981-6703-4107-b53c-92e6438155db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.355818 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content\") pod \"ed3c1981-6703-4107-b53c-92e6438155db\" (UID: \"ed3c1981-6703-4107-b53c-92e6438155db\") " Oct 08 23:19:24 crc kubenswrapper[4834]: W1008 23:19:24.355980 4834 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ed3c1981-6703-4107-b53c-92e6438155db/volumes/kubernetes.io~empty-dir/catalog-content Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.355994 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed3c1981-6703-4107-b53c-92e6438155db" (UID: "ed3c1981-6703-4107-b53c-92e6438155db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.356137 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.356189 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed3c1981-6703-4107-b53c-92e6438155db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.356209 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t6bd\" (UniqueName: \"kubernetes.io/projected/ed3c1981-6703-4107-b53c-92e6438155db-kube-api-access-2t6bd\") on node \"crc\" DevicePath \"\"" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.536104 4834 generic.go:334] "Generic (PLEG): container finished" podID="ed3c1981-6703-4107-b53c-92e6438155db" containerID="774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d" exitCode=0 Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.536185 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxrh5" event={"ID":"ed3c1981-6703-4107-b53c-92e6438155db","Type":"ContainerDied","Data":"774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d"} Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.536256 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxrh5" event={"ID":"ed3c1981-6703-4107-b53c-92e6438155db","Type":"ContainerDied","Data":"4a3a95dfffcea4ed6c8c35870c75071cbaed89279588b074e1f93b9dcddf888c"} Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.536283 4834 scope.go:117] "RemoveContainer" containerID="774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.536318 
4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxrh5" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.571438 4834 scope.go:117] "RemoveContainer" containerID="fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.579277 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxrh5"] Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.586520 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jxrh5"] Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.597386 4834 scope.go:117] "RemoveContainer" containerID="ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.624431 4834 scope.go:117] "RemoveContainer" containerID="774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d" Oct 08 23:19:24 crc kubenswrapper[4834]: E1008 23:19:24.624832 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d\": container with ID starting with 774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d not found: ID does not exist" containerID="774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.624865 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d"} err="failed to get container status \"774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d\": rpc error: code = NotFound desc = could not find container \"774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d\": container with ID starting with 
774b226bfade65e251c1b8e561c5888eb3fea2d66e4eb9a3f58a130a16fa003d not found: ID does not exist" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.624888 4834 scope.go:117] "RemoveContainer" containerID="fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31" Oct 08 23:19:24 crc kubenswrapper[4834]: E1008 23:19:24.625283 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31\": container with ID starting with fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31 not found: ID does not exist" containerID="fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.625330 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31"} err="failed to get container status \"fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31\": rpc error: code = NotFound desc = could not find container \"fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31\": container with ID starting with fe53d120f5bd2db71a1e503080599f756124713feef1578b388334baaa486d31 not found: ID does not exist" Oct 08 23:19:24 crc kubenswrapper[4834]: I1008 23:19:24.625372 4834 scope.go:117] "RemoveContainer" containerID="ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1" Oct 08 23:19:24 crc kubenswrapper[4834]: E1008 23:19:24.625737 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1\": container with ID starting with ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1 not found: ID does not exist" containerID="ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1" Oct 08 23:19:24 crc 
kubenswrapper[4834]: I1008 23:19:24.625759 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1"} err="failed to get container status \"ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1\": rpc error: code = NotFound desc = could not find container \"ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1\": container with ID starting with ec298253e97fb5a51a9483aff3985378dfa8b19dba97ff4cf34e010a0bde35d1 not found: ID does not exist" Oct 08 23:19:25 crc kubenswrapper[4834]: I1008 23:19:25.569262 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed3c1981-6703-4107-b53c-92e6438155db" path="/var/lib/kubelet/pods/ed3c1981-6703-4107-b53c-92e6438155db/volumes" Oct 08 23:19:47 crc kubenswrapper[4834]: I1008 23:19:47.025651 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:19:47 crc kubenswrapper[4834]: I1008 23:19:47.026406 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:20:17 crc kubenswrapper[4834]: I1008 23:20:17.025394 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:20:17 crc kubenswrapper[4834]: I1008 23:20:17.026242 4834 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.026280 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.027596 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.027713 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.029469 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec8bbd92a8a38a0a3ba2206e4c96f4d9510575445372448f780335597e1912a0"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.029625 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" 
containerName="machine-config-daemon" containerID="cri-o://ec8bbd92a8a38a0a3ba2206e4c96f4d9510575445372448f780335597e1912a0" gracePeriod=600 Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.360584 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="ec8bbd92a8a38a0a3ba2206e4c96f4d9510575445372448f780335597e1912a0" exitCode=0 Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.360655 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"ec8bbd92a8a38a0a3ba2206e4c96f4d9510575445372448f780335597e1912a0"} Oct 08 23:20:47 crc kubenswrapper[4834]: I1008 23:20:47.361139 4834 scope.go:117] "RemoveContainer" containerID="83ecd6087afdb8394ed35bded7f2347a72cd2cd29c5b7d0798c2c2ec868eb370" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.130068 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zn6j"] Oct 08 23:20:48 crc kubenswrapper[4834]: E1008 23:20:48.131853 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="registry-server" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.131880 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="registry-server" Oct 08 23:20:48 crc kubenswrapper[4834]: E1008 23:20:48.131916 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="extract-content" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.131928 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="extract-content" Oct 08 23:20:48 crc kubenswrapper[4834]: E1008 23:20:48.131968 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="extract-utilities" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.131980 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="extract-utilities" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.132295 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3c1981-6703-4107-b53c-92e6438155db" containerName="registry-server" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.134088 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.143378 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zn6j"] Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.208622 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-catalog-content\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.208678 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-utilities\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.208699 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tx97\" (UniqueName: \"kubernetes.io/projected/b5b20562-fc87-4a04-977c-624308a49849-kube-api-access-2tx97\") pod \"redhat-marketplace-6zn6j\" (UID: 
\"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.309856 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-utilities\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.310170 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tx97\" (UniqueName: \"kubernetes.io/projected/b5b20562-fc87-4a04-977c-624308a49849-kube-api-access-2tx97\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.310391 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-catalog-content\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.310532 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-utilities\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.310808 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-catalog-content\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " 
pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.337836 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tx97\" (UniqueName: \"kubernetes.io/projected/b5b20562-fc87-4a04-977c-624308a49849-kube-api-access-2tx97\") pod \"redhat-marketplace-6zn6j\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.370646 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a"} Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.458932 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:48 crc kubenswrapper[4834]: I1008 23:20:48.945938 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zn6j"] Oct 08 23:20:49 crc kubenswrapper[4834]: I1008 23:20:49.381622 4834 generic.go:334] "Generic (PLEG): container finished" podID="b5b20562-fc87-4a04-977c-624308a49849" containerID="c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7" exitCode=0 Oct 08 23:20:49 crc kubenswrapper[4834]: I1008 23:20:49.382476 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zn6j" event={"ID":"b5b20562-fc87-4a04-977c-624308a49849","Type":"ContainerDied","Data":"c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7"} Oct 08 23:20:49 crc kubenswrapper[4834]: I1008 23:20:49.382532 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zn6j" 
event={"ID":"b5b20562-fc87-4a04-977c-624308a49849","Type":"ContainerStarted","Data":"5a3298460069039558afbd69de871d8cf2bfb52427d8e2617550a3ff3c5266b8"} Oct 08 23:20:51 crc kubenswrapper[4834]: I1008 23:20:51.400748 4834 generic.go:334] "Generic (PLEG): container finished" podID="b5b20562-fc87-4a04-977c-624308a49849" containerID="3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde" exitCode=0 Oct 08 23:20:51 crc kubenswrapper[4834]: I1008 23:20:51.400860 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zn6j" event={"ID":"b5b20562-fc87-4a04-977c-624308a49849","Type":"ContainerDied","Data":"3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde"} Oct 08 23:20:52 crc kubenswrapper[4834]: I1008 23:20:52.411315 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zn6j" event={"ID":"b5b20562-fc87-4a04-977c-624308a49849","Type":"ContainerStarted","Data":"59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda"} Oct 08 23:20:52 crc kubenswrapper[4834]: I1008 23:20:52.441301 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zn6j" podStartSLOduration=1.993605152 podStartE2EDuration="4.441258985s" podCreationTimestamp="2025-10-08 23:20:48 +0000 UTC" firstStartedPulling="2025-10-08 23:20:49.383973021 +0000 UTC m=+3457.206857807" lastFinishedPulling="2025-10-08 23:20:51.831626844 +0000 UTC m=+3459.654511640" observedRunningTime="2025-10-08 23:20:52.43655325 +0000 UTC m=+3460.259438036" watchObservedRunningTime="2025-10-08 23:20:52.441258985 +0000 UTC m=+3460.264143731" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.533886 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6w8cw"] Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.537931 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.561581 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6w8cw"] Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.599269 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpdkw\" (UniqueName: \"kubernetes.io/projected/655d046d-13e3-4cbc-9dda-e5b629f224da-kube-api-access-dpdkw\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.599334 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-catalog-content\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.599416 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-utilities\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.700874 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-catalog-content\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.700996 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-utilities\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.701093 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpdkw\" (UniqueName: \"kubernetes.io/projected/655d046d-13e3-4cbc-9dda-e5b629f224da-kube-api-access-dpdkw\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.701876 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-utilities\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.702996 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-catalog-content\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.734654 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpdkw\" (UniqueName: \"kubernetes.io/projected/655d046d-13e3-4cbc-9dda-e5b629f224da-kube-api-access-dpdkw\") pod \"certified-operators-6w8cw\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:54 crc kubenswrapper[4834]: I1008 23:20:54.884387 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:20:55 crc kubenswrapper[4834]: I1008 23:20:55.402295 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6w8cw"] Oct 08 23:20:55 crc kubenswrapper[4834]: W1008 23:20:55.408583 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655d046d_13e3_4cbc_9dda_e5b629f224da.slice/crio-375c2202b79504b2b1142738d174ea8344c1ab21975dd1bd05463a538091fbff WatchSource:0}: Error finding container 375c2202b79504b2b1142738d174ea8344c1ab21975dd1bd05463a538091fbff: Status 404 returned error can't find the container with id 375c2202b79504b2b1142738d174ea8344c1ab21975dd1bd05463a538091fbff Oct 08 23:20:55 crc kubenswrapper[4834]: I1008 23:20:55.441574 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6w8cw" event={"ID":"655d046d-13e3-4cbc-9dda-e5b629f224da","Type":"ContainerStarted","Data":"375c2202b79504b2b1142738d174ea8344c1ab21975dd1bd05463a538091fbff"} Oct 08 23:20:56 crc kubenswrapper[4834]: I1008 23:20:56.455239 4834 generic.go:334] "Generic (PLEG): container finished" podID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerID="cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487" exitCode=0 Oct 08 23:20:56 crc kubenswrapper[4834]: I1008 23:20:56.455872 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6w8cw" event={"ID":"655d046d-13e3-4cbc-9dda-e5b629f224da","Type":"ContainerDied","Data":"cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487"} Oct 08 23:20:58 crc kubenswrapper[4834]: I1008 23:20:58.459308 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:58 crc kubenswrapper[4834]: I1008 23:20:58.459677 4834 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:58 crc kubenswrapper[4834]: I1008 23:20:58.475952 4834 generic.go:334] "Generic (PLEG): container finished" podID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerID="1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718" exitCode=0 Oct 08 23:20:58 crc kubenswrapper[4834]: I1008 23:20:58.475996 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6w8cw" event={"ID":"655d046d-13e3-4cbc-9dda-e5b629f224da","Type":"ContainerDied","Data":"1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718"} Oct 08 23:20:58 crc kubenswrapper[4834]: I1008 23:20:58.504026 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:58 crc kubenswrapper[4834]: I1008 23:20:58.542614 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:20:59 crc kubenswrapper[4834]: I1008 23:20:59.486792 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6w8cw" event={"ID":"655d046d-13e3-4cbc-9dda-e5b629f224da","Type":"ContainerStarted","Data":"f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be"} Oct 08 23:20:59 crc kubenswrapper[4834]: I1008 23:20:59.517910 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zn6j"] Oct 08 23:20:59 crc kubenswrapper[4834]: I1008 23:20:59.532230 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6w8cw" podStartSLOduration=2.957026334 podStartE2EDuration="5.532205068s" podCreationTimestamp="2025-10-08 23:20:54 +0000 UTC" firstStartedPulling="2025-10-08 23:20:56.458681978 +0000 UTC m=+3464.281566764" lastFinishedPulling="2025-10-08 23:20:59.033860752 +0000 UTC 
m=+3466.856745498" observedRunningTime="2025-10-08 23:20:59.522721806 +0000 UTC m=+3467.345606562" watchObservedRunningTime="2025-10-08 23:20:59.532205068 +0000 UTC m=+3467.355089854" Oct 08 23:21:00 crc kubenswrapper[4834]: I1008 23:21:00.495418 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6zn6j" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="registry-server" containerID="cri-o://59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda" gracePeriod=2 Oct 08 23:21:00 crc kubenswrapper[4834]: I1008 23:21:00.972092 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.015779 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-catalog-content\") pod \"b5b20562-fc87-4a04-977c-624308a49849\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.015885 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-utilities\") pod \"b5b20562-fc87-4a04-977c-624308a49849\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.015910 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tx97\" (UniqueName: \"kubernetes.io/projected/b5b20562-fc87-4a04-977c-624308a49849-kube-api-access-2tx97\") pod \"b5b20562-fc87-4a04-977c-624308a49849\" (UID: \"b5b20562-fc87-4a04-977c-624308a49849\") " Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.017694 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-utilities" (OuterVolumeSpecName: "utilities") pod "b5b20562-fc87-4a04-977c-624308a49849" (UID: "b5b20562-fc87-4a04-977c-624308a49849"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.022557 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b20562-fc87-4a04-977c-624308a49849-kube-api-access-2tx97" (OuterVolumeSpecName: "kube-api-access-2tx97") pod "b5b20562-fc87-4a04-977c-624308a49849" (UID: "b5b20562-fc87-4a04-977c-624308a49849"). InnerVolumeSpecName "kube-api-access-2tx97". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.045195 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5b20562-fc87-4a04-977c-624308a49849" (UID: "b5b20562-fc87-4a04-977c-624308a49849"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.118062 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.118107 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tx97\" (UniqueName: \"kubernetes.io/projected/b5b20562-fc87-4a04-977c-624308a49849-kube-api-access-2tx97\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.118128 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5b20562-fc87-4a04-977c-624308a49849-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.506527 4834 generic.go:334] "Generic (PLEG): container finished" podID="b5b20562-fc87-4a04-977c-624308a49849" containerID="59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda" exitCode=0 Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.506601 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zn6j" event={"ID":"b5b20562-fc87-4a04-977c-624308a49849","Type":"ContainerDied","Data":"59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda"} Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.506640 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zn6j" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.506676 4834 scope.go:117] "RemoveContainer" containerID="59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.506657 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zn6j" event={"ID":"b5b20562-fc87-4a04-977c-624308a49849","Type":"ContainerDied","Data":"5a3298460069039558afbd69de871d8cf2bfb52427d8e2617550a3ff3c5266b8"} Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.537993 4834 scope.go:117] "RemoveContainer" containerID="3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.553107 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zn6j"] Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.565443 4834 scope.go:117] "RemoveContainer" containerID="c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.573642 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zn6j"] Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.601476 4834 scope.go:117] "RemoveContainer" containerID="59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda" Oct 08 23:21:01 crc kubenswrapper[4834]: E1008 23:21:01.602003 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda\": container with ID starting with 59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda not found: ID does not exist" containerID="59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.602045 4834 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda"} err="failed to get container status \"59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda\": rpc error: code = NotFound desc = could not find container \"59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda\": container with ID starting with 59e045e4abf21ff39ad18f7a85d3e0a680cb8028c96abecad6437721eef43dda not found: ID does not exist" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.602072 4834 scope.go:117] "RemoveContainer" containerID="3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde" Oct 08 23:21:01 crc kubenswrapper[4834]: E1008 23:21:01.602492 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde\": container with ID starting with 3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde not found: ID does not exist" containerID="3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.602520 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde"} err="failed to get container status \"3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde\": rpc error: code = NotFound desc = could not find container \"3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde\": container with ID starting with 3decf3a472665220750bdd56440ac86b123a063da8b2df0d38d08ed638790bde not found: ID does not exist" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.602534 4834 scope.go:117] "RemoveContainer" containerID="c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7" Oct 08 23:21:01 crc kubenswrapper[4834]: E1008 
23:21:01.602777 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7\": container with ID starting with c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7 not found: ID does not exist" containerID="c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7" Oct 08 23:21:01 crc kubenswrapper[4834]: I1008 23:21:01.602808 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7"} err="failed to get container status \"c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7\": rpc error: code = NotFound desc = could not find container \"c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7\": container with ID starting with c805321fdf25d1afbbbbdceafaacf565b20cea2be88a61e7bbef94e5a1e429a7 not found: ID does not exist" Oct 08 23:21:03 crc kubenswrapper[4834]: I1008 23:21:03.571915 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b20562-fc87-4a04-977c-624308a49849" path="/var/lib/kubelet/pods/b5b20562-fc87-4a04-977c-624308a49849/volumes" Oct 08 23:21:04 crc kubenswrapper[4834]: I1008 23:21:04.884875 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:21:04 crc kubenswrapper[4834]: I1008 23:21:04.884943 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:21:04 crc kubenswrapper[4834]: I1008 23:21:04.969140 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:21:05 crc kubenswrapper[4834]: I1008 23:21:05.608589 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:21:06 crc kubenswrapper[4834]: I1008 23:21:06.516189 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6w8cw"] Oct 08 23:21:07 crc kubenswrapper[4834]: I1008 23:21:07.571772 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6w8cw" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="registry-server" containerID="cri-o://f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be" gracePeriod=2 Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.042762 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.126924 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpdkw\" (UniqueName: \"kubernetes.io/projected/655d046d-13e3-4cbc-9dda-e5b629f224da-kube-api-access-dpdkw\") pod \"655d046d-13e3-4cbc-9dda-e5b629f224da\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.127008 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-catalog-content\") pod \"655d046d-13e3-4cbc-9dda-e5b629f224da\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.127110 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-utilities\") pod \"655d046d-13e3-4cbc-9dda-e5b629f224da\" (UID: \"655d046d-13e3-4cbc-9dda-e5b629f224da\") " Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.128383 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-utilities" (OuterVolumeSpecName: "utilities") pod "655d046d-13e3-4cbc-9dda-e5b629f224da" (UID: "655d046d-13e3-4cbc-9dda-e5b629f224da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.133213 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655d046d-13e3-4cbc-9dda-e5b629f224da-kube-api-access-dpdkw" (OuterVolumeSpecName: "kube-api-access-dpdkw") pod "655d046d-13e3-4cbc-9dda-e5b629f224da" (UID: "655d046d-13e3-4cbc-9dda-e5b629f224da"). InnerVolumeSpecName "kube-api-access-dpdkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.202611 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "655d046d-13e3-4cbc-9dda-e5b629f224da" (UID: "655d046d-13e3-4cbc-9dda-e5b629f224da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.229314 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.229366 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpdkw\" (UniqueName: \"kubernetes.io/projected/655d046d-13e3-4cbc-9dda-e5b629f224da-kube-api-access-dpdkw\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.229387 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655d046d-13e3-4cbc-9dda-e5b629f224da-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.586089 4834 generic.go:334] "Generic (PLEG): container finished" podID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerID="f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be" exitCode=0 Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.586183 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6w8cw" event={"ID":"655d046d-13e3-4cbc-9dda-e5b629f224da","Type":"ContainerDied","Data":"f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be"} Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.586246 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6w8cw" event={"ID":"655d046d-13e3-4cbc-9dda-e5b629f224da","Type":"ContainerDied","Data":"375c2202b79504b2b1142738d174ea8344c1ab21975dd1bd05463a538091fbff"} Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.586278 4834 scope.go:117] "RemoveContainer" containerID="f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 
23:21:08.587405 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6w8cw" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.618896 4834 scope.go:117] "RemoveContainer" containerID="1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.647687 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6w8cw"] Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.658402 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6w8cw"] Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.668795 4834 scope.go:117] "RemoveContainer" containerID="cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.699253 4834 scope.go:117] "RemoveContainer" containerID="f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be" Oct 08 23:21:08 crc kubenswrapper[4834]: E1008 23:21:08.699996 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be\": container with ID starting with f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be not found: ID does not exist" containerID="f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.700092 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be"} err="failed to get container status \"f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be\": rpc error: code = NotFound desc = could not find container \"f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be\": container with ID starting with 
f181a42ef42a7cc45d411e41c041de3b253874dc2dc9d4c58f557c438fa7a9be not found: ID does not exist" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.700131 4834 scope.go:117] "RemoveContainer" containerID="1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718" Oct 08 23:21:08 crc kubenswrapper[4834]: E1008 23:21:08.700917 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718\": container with ID starting with 1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718 not found: ID does not exist" containerID="1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.700985 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718"} err="failed to get container status \"1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718\": rpc error: code = NotFound desc = could not find container \"1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718\": container with ID starting with 1e68efa80bf6c8ee516aaaf04a0c8e4662790a262a7d841158beab8b68200718 not found: ID does not exist" Oct 08 23:21:08 crc kubenswrapper[4834]: I1008 23:21:08.701028 4834 scope.go:117] "RemoveContainer" containerID="cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487" Oct 08 23:21:08 crc kubenswrapper[4834]: E1008 23:21:08.701791 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487\": container with ID starting with cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487 not found: ID does not exist" containerID="cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487" Oct 08 23:21:08 crc 
kubenswrapper[4834]: I1008 23:21:08.701894 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487"} err="failed to get container status \"cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487\": rpc error: code = NotFound desc = could not find container \"cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487\": container with ID starting with cb2f4f86d10835c3ee92b3c16ae87ddcca79698c397cc066a02d554eee198487 not found: ID does not exist" Oct 08 23:21:09 crc kubenswrapper[4834]: I1008 23:21:09.580925 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" path="/var/lib/kubelet/pods/655d046d-13e3-4cbc-9dda-e5b629f224da/volumes" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.707478 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zql4r"] Oct 08 23:21:12 crc kubenswrapper[4834]: E1008 23:21:12.712379 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="extract-content" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712426 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="extract-content" Oct 08 23:21:12 crc kubenswrapper[4834]: E1008 23:21:12.712467 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="extract-content" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712481 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="extract-content" Oct 08 23:21:12 crc kubenswrapper[4834]: E1008 23:21:12.712505 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="extract-utilities" Oct 
08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712519 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="extract-utilities" Oct 08 23:21:12 crc kubenswrapper[4834]: E1008 23:21:12.712537 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="registry-server" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712549 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="registry-server" Oct 08 23:21:12 crc kubenswrapper[4834]: E1008 23:21:12.712571 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="registry-server" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712584 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="registry-server" Oct 08 23:21:12 crc kubenswrapper[4834]: E1008 23:21:12.712610 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="extract-utilities" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712624 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="extract-utilities" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712965 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b20562-fc87-4a04-977c-624308a49849" containerName="registry-server" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.712988 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="655d046d-13e3-4cbc-9dda-e5b629f224da" containerName="registry-server" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.714813 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.716920 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zql4r"] Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.842053 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-catalog-content\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.842203 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpqgw\" (UniqueName: \"kubernetes.io/projected/8717a475-3cbf-404d-a93b-9308baa16f05-kube-api-access-tpqgw\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.842293 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-utilities\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.943476 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-catalog-content\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.943559 4834 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tpqgw\" (UniqueName: \"kubernetes.io/projected/8717a475-3cbf-404d-a93b-9308baa16f05-kube-api-access-tpqgw\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.943614 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-utilities\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.944125 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-catalog-content\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.944238 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-utilities\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:12 crc kubenswrapper[4834]: I1008 23:21:12.974783 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpqgw\" (UniqueName: \"kubernetes.io/projected/8717a475-3cbf-404d-a93b-9308baa16f05-kube-api-access-tpqgw\") pod \"community-operators-zql4r\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:13 crc kubenswrapper[4834]: I1008 23:21:13.056206 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:13 crc kubenswrapper[4834]: I1008 23:21:13.345758 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zql4r"] Oct 08 23:21:13 crc kubenswrapper[4834]: I1008 23:21:13.652423 4834 generic.go:334] "Generic (PLEG): container finished" podID="8717a475-3cbf-404d-a93b-9308baa16f05" containerID="03982a11c54234a4754d3f49587915693917dc69caff60239ffc04267f6450f1" exitCode=0 Oct 08 23:21:13 crc kubenswrapper[4834]: I1008 23:21:13.652544 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zql4r" event={"ID":"8717a475-3cbf-404d-a93b-9308baa16f05","Type":"ContainerDied","Data":"03982a11c54234a4754d3f49587915693917dc69caff60239ffc04267f6450f1"} Oct 08 23:21:13 crc kubenswrapper[4834]: I1008 23:21:13.652905 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zql4r" event={"ID":"8717a475-3cbf-404d-a93b-9308baa16f05","Type":"ContainerStarted","Data":"705f343fb4bf63243b3a93e83f6fe96becba16b1864eff4e45786295ab33c594"} Oct 08 23:21:15 crc kubenswrapper[4834]: I1008 23:21:15.670100 4834 generic.go:334] "Generic (PLEG): container finished" podID="8717a475-3cbf-404d-a93b-9308baa16f05" containerID="e64cac0f2cf77d86280b70aa18eb179151048128e27f332f61cd0c429f29a4c7" exitCode=0 Oct 08 23:21:15 crc kubenswrapper[4834]: I1008 23:21:15.670227 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zql4r" event={"ID":"8717a475-3cbf-404d-a93b-9308baa16f05","Type":"ContainerDied","Data":"e64cac0f2cf77d86280b70aa18eb179151048128e27f332f61cd0c429f29a4c7"} Oct 08 23:21:16 crc kubenswrapper[4834]: I1008 23:21:16.681412 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zql4r" 
event={"ID":"8717a475-3cbf-404d-a93b-9308baa16f05","Type":"ContainerStarted","Data":"2e9d2b3611e260d7fce6ac0899dbea2758b23788a20d0eb025a1ef338271f4cc"} Oct 08 23:21:16 crc kubenswrapper[4834]: I1008 23:21:16.705984 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zql4r" podStartSLOduration=2.263566517 podStartE2EDuration="4.705968302s" podCreationTimestamp="2025-10-08 23:21:12 +0000 UTC" firstStartedPulling="2025-10-08 23:21:13.654943021 +0000 UTC m=+3481.477827767" lastFinishedPulling="2025-10-08 23:21:16.097344806 +0000 UTC m=+3483.920229552" observedRunningTime="2025-10-08 23:21:16.704434694 +0000 UTC m=+3484.527319440" watchObservedRunningTime="2025-10-08 23:21:16.705968302 +0000 UTC m=+3484.528853048" Oct 08 23:21:23 crc kubenswrapper[4834]: I1008 23:21:23.058417 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:23 crc kubenswrapper[4834]: I1008 23:21:23.059564 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:23 crc kubenswrapper[4834]: I1008 23:21:23.130952 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:23 crc kubenswrapper[4834]: I1008 23:21:23.812698 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:23 crc kubenswrapper[4834]: I1008 23:21:23.884077 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zql4r"] Oct 08 23:21:25 crc kubenswrapper[4834]: I1008 23:21:25.767099 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zql4r" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="registry-server" 
containerID="cri-o://2e9d2b3611e260d7fce6ac0899dbea2758b23788a20d0eb025a1ef338271f4cc" gracePeriod=2 Oct 08 23:21:26 crc kubenswrapper[4834]: I1008 23:21:26.782654 4834 generic.go:334] "Generic (PLEG): container finished" podID="8717a475-3cbf-404d-a93b-9308baa16f05" containerID="2e9d2b3611e260d7fce6ac0899dbea2758b23788a20d0eb025a1ef338271f4cc" exitCode=0 Oct 08 23:21:26 crc kubenswrapper[4834]: I1008 23:21:26.782731 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zql4r" event={"ID":"8717a475-3cbf-404d-a93b-9308baa16f05","Type":"ContainerDied","Data":"2e9d2b3611e260d7fce6ac0899dbea2758b23788a20d0eb025a1ef338271f4cc"} Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.405140 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.478706 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpqgw\" (UniqueName: \"kubernetes.io/projected/8717a475-3cbf-404d-a93b-9308baa16f05-kube-api-access-tpqgw\") pod \"8717a475-3cbf-404d-a93b-9308baa16f05\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.478861 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-utilities\") pod \"8717a475-3cbf-404d-a93b-9308baa16f05\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.478935 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-catalog-content\") pod \"8717a475-3cbf-404d-a93b-9308baa16f05\" (UID: \"8717a475-3cbf-404d-a93b-9308baa16f05\") " Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 
23:21:27.480680 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-utilities" (OuterVolumeSpecName: "utilities") pod "8717a475-3cbf-404d-a93b-9308baa16f05" (UID: "8717a475-3cbf-404d-a93b-9308baa16f05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.488779 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8717a475-3cbf-404d-a93b-9308baa16f05-kube-api-access-tpqgw" (OuterVolumeSpecName: "kube-api-access-tpqgw") pod "8717a475-3cbf-404d-a93b-9308baa16f05" (UID: "8717a475-3cbf-404d-a93b-9308baa16f05"). InnerVolumeSpecName "kube-api-access-tpqgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.560549 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8717a475-3cbf-404d-a93b-9308baa16f05" (UID: "8717a475-3cbf-404d-a93b-9308baa16f05"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.581382 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpqgw\" (UniqueName: \"kubernetes.io/projected/8717a475-3cbf-404d-a93b-9308baa16f05-kube-api-access-tpqgw\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.581437 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.581450 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8717a475-3cbf-404d-a93b-9308baa16f05-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.801404 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zql4r" event={"ID":"8717a475-3cbf-404d-a93b-9308baa16f05","Type":"ContainerDied","Data":"705f343fb4bf63243b3a93e83f6fe96becba16b1864eff4e45786295ab33c594"} Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.801480 4834 scope.go:117] "RemoveContainer" containerID="2e9d2b3611e260d7fce6ac0899dbea2758b23788a20d0eb025a1ef338271f4cc" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.801650 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zql4r" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.837360 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zql4r"] Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.843440 4834 scope.go:117] "RemoveContainer" containerID="e64cac0f2cf77d86280b70aa18eb179151048128e27f332f61cd0c429f29a4c7" Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.846838 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zql4r"] Oct 08 23:21:27 crc kubenswrapper[4834]: I1008 23:21:27.880290 4834 scope.go:117] "RemoveContainer" containerID="03982a11c54234a4754d3f49587915693917dc69caff60239ffc04267f6450f1" Oct 08 23:21:29 crc kubenswrapper[4834]: I1008 23:21:29.572670 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" path="/var/lib/kubelet/pods/8717a475-3cbf-404d-a93b-9308baa16f05/volumes" Oct 08 23:22:47 crc kubenswrapper[4834]: I1008 23:22:47.025655 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:22:47 crc kubenswrapper[4834]: I1008 23:22:47.026569 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:23:17 crc kubenswrapper[4834]: I1008 23:23:17.025402 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:23:17 crc kubenswrapper[4834]: I1008 23:23:17.026284 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:23:47 crc kubenswrapper[4834]: I1008 23:23:47.025319 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:23:47 crc kubenswrapper[4834]: I1008 23:23:47.026201 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:23:47 crc kubenswrapper[4834]: I1008 23:23:47.026293 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:23:47 crc kubenswrapper[4834]: I1008 23:23:47.027414 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:23:47 crc kubenswrapper[4834]: I1008 23:23:47.027520 4834 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" gracePeriod=600 Oct 08 23:23:47 crc kubenswrapper[4834]: E1008 23:23:47.160217 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:23:48 crc kubenswrapper[4834]: I1008 23:23:48.163486 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" exitCode=0 Oct 08 23:23:48 crc kubenswrapper[4834]: I1008 23:23:48.163552 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a"} Oct 08 23:23:48 crc kubenswrapper[4834]: I1008 23:23:48.163676 4834 scope.go:117] "RemoveContainer" containerID="ec8bbd92a8a38a0a3ba2206e4c96f4d9510575445372448f780335597e1912a0" Oct 08 23:23:48 crc kubenswrapper[4834]: I1008 23:23:48.164237 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:23:48 crc kubenswrapper[4834]: E1008 23:23:48.164607 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:23:59 crc kubenswrapper[4834]: I1008 23:23:59.556586 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:23:59 crc kubenswrapper[4834]: E1008 23:23:59.557670 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:24:10 crc kubenswrapper[4834]: I1008 23:24:10.555275 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:24:10 crc kubenswrapper[4834]: E1008 23:24:10.556114 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:24:22 crc kubenswrapper[4834]: I1008 23:24:22.555075 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:24:22 crc kubenswrapper[4834]: E1008 23:24:22.556282 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:24:33 crc kubenswrapper[4834]: I1008 23:24:33.562932 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:24:33 crc kubenswrapper[4834]: E1008 23:24:33.565609 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:24:45 crc kubenswrapper[4834]: I1008 23:24:45.558607 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:24:45 crc kubenswrapper[4834]: E1008 23:24:45.560734 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:24:58 crc kubenswrapper[4834]: I1008 23:24:58.555794 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:24:58 crc kubenswrapper[4834]: E1008 23:24:58.557102 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:25:12 crc kubenswrapper[4834]: I1008 23:25:12.555553 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:25:12 crc kubenswrapper[4834]: E1008 23:25:12.556541 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:25:27 crc kubenswrapper[4834]: I1008 23:25:27.555984 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:25:27 crc kubenswrapper[4834]: E1008 23:25:27.557135 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:25:41 crc kubenswrapper[4834]: I1008 23:25:41.554983 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:25:41 crc kubenswrapper[4834]: E1008 23:25:41.556490 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:25:53 crc kubenswrapper[4834]: I1008 23:25:53.559486 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:25:53 crc kubenswrapper[4834]: E1008 23:25:53.560569 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:26:07 crc kubenswrapper[4834]: I1008 23:26:07.556013 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:26:07 crc kubenswrapper[4834]: E1008 23:26:07.556975 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:26:22 crc kubenswrapper[4834]: I1008 23:26:22.555809 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:26:22 crc kubenswrapper[4834]: E1008 23:26:22.556648 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:26:33 crc kubenswrapper[4834]: I1008 23:26:33.562729 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:26:33 crc kubenswrapper[4834]: E1008 23:26:33.563809 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:26:46 crc kubenswrapper[4834]: I1008 23:26:46.556534 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:26:46 crc kubenswrapper[4834]: E1008 23:26:46.557610 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:26:57 crc kubenswrapper[4834]: I1008 23:26:57.556786 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:26:57 crc kubenswrapper[4834]: E1008 23:26:57.558023 4834 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:27:09 crc kubenswrapper[4834]: I1008 23:27:09.555554 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:27:09 crc kubenswrapper[4834]: E1008 23:27:09.556429 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:27:21 crc kubenswrapper[4834]: I1008 23:27:21.555371 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:27:21 crc kubenswrapper[4834]: E1008 23:27:21.556649 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:27:36 crc kubenswrapper[4834]: I1008 23:27:36.555375 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:27:36 crc kubenswrapper[4834]: E1008 23:27:36.556727 4834 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:27:50 crc kubenswrapper[4834]: I1008 23:27:50.555352 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:27:50 crc kubenswrapper[4834]: E1008 23:27:50.556427 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:28:01 crc kubenswrapper[4834]: I1008 23:28:01.555973 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:28:01 crc kubenswrapper[4834]: E1008 23:28:01.556965 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:28:16 crc kubenswrapper[4834]: I1008 23:28:16.556213 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:28:16 crc kubenswrapper[4834]: E1008 23:28:16.557385 4834 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:28:28 crc kubenswrapper[4834]: I1008 23:28:28.555823 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:28:28 crc kubenswrapper[4834]: E1008 23:28:28.556693 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:28:39 crc kubenswrapper[4834]: I1008 23:28:39.555927 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:28:39 crc kubenswrapper[4834]: E1008 23:28:39.556549 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:28:52 crc kubenswrapper[4834]: I1008 23:28:52.556446 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:28:53 crc kubenswrapper[4834]: I1008 
23:28:53.079633 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"a5e9e52ef77663387f3ef38b160c0950f459925f0fa3b6dc9084e3092d0e0a5c"} Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.159281 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv"] Oct 08 23:30:00 crc kubenswrapper[4834]: E1008 23:30:00.160192 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="registry-server" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.160211 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="registry-server" Oct 08 23:30:00 crc kubenswrapper[4834]: E1008 23:30:00.160227 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="extract-utilities" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.160233 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="extract-utilities" Oct 08 23:30:00 crc kubenswrapper[4834]: E1008 23:30:00.160257 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="extract-content" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.160264 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="extract-content" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.160426 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="8717a475-3cbf-404d-a93b-9308baa16f05" containerName="registry-server" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.160998 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.164438 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.164442 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.178379 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv"] Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.268778 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de756d70-20be-4337-9d3b-a9b7b5420123-secret-volume\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.268857 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de756d70-20be-4337-9d3b-a9b7b5420123-config-volume\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.269296 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqvf\" (UniqueName: \"kubernetes.io/projected/de756d70-20be-4337-9d3b-a9b7b5420123-kube-api-access-jjqvf\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.371099 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqvf\" (UniqueName: \"kubernetes.io/projected/de756d70-20be-4337-9d3b-a9b7b5420123-kube-api-access-jjqvf\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.371235 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de756d70-20be-4337-9d3b-a9b7b5420123-secret-volume\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.371294 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de756d70-20be-4337-9d3b-a9b7b5420123-config-volume\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.373235 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de756d70-20be-4337-9d3b-a9b7b5420123-config-volume\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.378977 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/de756d70-20be-4337-9d3b-a9b7b5420123-secret-volume\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.387923 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqvf\" (UniqueName: \"kubernetes.io/projected/de756d70-20be-4337-9d3b-a9b7b5420123-kube-api-access-jjqvf\") pod \"collect-profiles-29332770-pvffv\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.487768 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:00 crc kubenswrapper[4834]: I1008 23:30:00.920841 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv"] Oct 08 23:30:01 crc kubenswrapper[4834]: I1008 23:30:01.703488 4834 generic.go:334] "Generic (PLEG): container finished" podID="de756d70-20be-4337-9d3b-a9b7b5420123" containerID="3f1d3290051a2998f00e8c545c9ea99b45cbd1f234ba9a00e5ed13d682a4a670" exitCode=0 Oct 08 23:30:01 crc kubenswrapper[4834]: I1008 23:30:01.703563 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" event={"ID":"de756d70-20be-4337-9d3b-a9b7b5420123","Type":"ContainerDied","Data":"3f1d3290051a2998f00e8c545c9ea99b45cbd1f234ba9a00e5ed13d682a4a670"} Oct 08 23:30:01 crc kubenswrapper[4834]: I1008 23:30:01.703779 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" 
event={"ID":"de756d70-20be-4337-9d3b-a9b7b5420123","Type":"ContainerStarted","Data":"14d082ea578680f5c92becc1a9c8faf6c690d97d1ac2425bd8c4c5f57d6c5774"} Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.079995 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.115923 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de756d70-20be-4337-9d3b-a9b7b5420123-secret-volume\") pod \"de756d70-20be-4337-9d3b-a9b7b5420123\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.116206 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de756d70-20be-4337-9d3b-a9b7b5420123-config-volume\") pod \"de756d70-20be-4337-9d3b-a9b7b5420123\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.116244 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjqvf\" (UniqueName: \"kubernetes.io/projected/de756d70-20be-4337-9d3b-a9b7b5420123-kube-api-access-jjqvf\") pod \"de756d70-20be-4337-9d3b-a9b7b5420123\" (UID: \"de756d70-20be-4337-9d3b-a9b7b5420123\") " Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.116943 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de756d70-20be-4337-9d3b-a9b7b5420123-config-volume" (OuterVolumeSpecName: "config-volume") pod "de756d70-20be-4337-9d3b-a9b7b5420123" (UID: "de756d70-20be-4337-9d3b-a9b7b5420123"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.127474 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de756d70-20be-4337-9d3b-a9b7b5420123-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "de756d70-20be-4337-9d3b-a9b7b5420123" (UID: "de756d70-20be-4337-9d3b-a9b7b5420123"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.127630 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de756d70-20be-4337-9d3b-a9b7b5420123-kube-api-access-jjqvf" (OuterVolumeSpecName: "kube-api-access-jjqvf") pod "de756d70-20be-4337-9d3b-a9b7b5420123" (UID: "de756d70-20be-4337-9d3b-a9b7b5420123"). InnerVolumeSpecName "kube-api-access-jjqvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.218006 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de756d70-20be-4337-9d3b-a9b7b5420123-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.218055 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de756d70-20be-4337-9d3b-a9b7b5420123-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.218067 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjqvf\" (UniqueName: \"kubernetes.io/projected/de756d70-20be-4337-9d3b-a9b7b5420123-kube-api-access-jjqvf\") on node \"crc\" DevicePath \"\"" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.734866 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" 
event={"ID":"de756d70-20be-4337-9d3b-a9b7b5420123","Type":"ContainerDied","Data":"14d082ea578680f5c92becc1a9c8faf6c690d97d1ac2425bd8c4c5f57d6c5774"} Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.735443 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14d082ea578680f5c92becc1a9c8faf6c690d97d1ac2425bd8c4c5f57d6c5774" Oct 08 23:30:03 crc kubenswrapper[4834]: I1008 23:30:03.735034 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332770-pvffv" Oct 08 23:30:04 crc kubenswrapper[4834]: I1008 23:30:04.179226 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds"] Oct 08 23:30:04 crc kubenswrapper[4834]: I1008 23:30:04.185172 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332725-9mxds"] Oct 08 23:30:05 crc kubenswrapper[4834]: I1008 23:30:05.573826 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3073f21-b4aa-4113-964e-d4e157b9d53e" path="/var/lib/kubelet/pods/b3073f21-b4aa-4113-964e-d4e157b9d53e/volumes" Oct 08 23:30:22 crc kubenswrapper[4834]: I1008 23:30:22.855806 4834 scope.go:117] "RemoveContainer" containerID="045bfd6e47c7cb04456f3f30e28a9fb501037bf64345af689d7236e746207bdb" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.020453 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bqbt4"] Oct 08 23:31:13 crc kubenswrapper[4834]: E1008 23:31:13.021843 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de756d70-20be-4337-9d3b-a9b7b5420123" containerName="collect-profiles" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.021875 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="de756d70-20be-4337-9d3b-a9b7b5420123" containerName="collect-profiles" Oct 08 23:31:13 crc 
kubenswrapper[4834]: I1008 23:31:13.022373 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="de756d70-20be-4337-9d3b-a9b7b5420123" containerName="collect-profiles" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.024670 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.035390 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bqbt4"] Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.056191 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-utilities\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.056394 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-catalog-content\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.056877 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz9sm\" (UniqueName: \"kubernetes.io/projected/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-kube-api-access-qz9sm\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.157973 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-catalog-content\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.158051 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz9sm\" (UniqueName: \"kubernetes.io/projected/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-kube-api-access-qz9sm\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.158115 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-utilities\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.158750 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-utilities\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.158776 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-catalog-content\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.191445 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz9sm\" (UniqueName: 
\"kubernetes.io/projected/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-kube-api-access-qz9sm\") pod \"certified-operators-bqbt4\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.356985 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:13 crc kubenswrapper[4834]: I1008 23:31:13.837639 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bqbt4"] Oct 08 23:31:14 crc kubenswrapper[4834]: I1008 23:31:14.420545 4834 generic.go:334] "Generic (PLEG): container finished" podID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerID="79e8b6203e37b8f37ba3e5cc98dfca0cebf9bbf0321437579982a4e94fbb5d94" exitCode=0 Oct 08 23:31:14 crc kubenswrapper[4834]: I1008 23:31:14.420582 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerDied","Data":"79e8b6203e37b8f37ba3e5cc98dfca0cebf9bbf0321437579982a4e94fbb5d94"} Oct 08 23:31:14 crc kubenswrapper[4834]: I1008 23:31:14.420859 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerStarted","Data":"4378afd1609a6fd3a572bd6fb5918753f153fbd389478b27d0b3744f775cdba6"} Oct 08 23:31:14 crc kubenswrapper[4834]: I1008 23:31:14.421991 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 08 23:31:15 crc kubenswrapper[4834]: I1008 23:31:15.432126 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerStarted","Data":"579ef5b9a5b5f64a247a65f2b4df62bd207fbb5792687fcb87deba68c5ea1c97"} Oct 08 23:31:16 
crc kubenswrapper[4834]: I1008 23:31:16.447923 4834 generic.go:334] "Generic (PLEG): container finished" podID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerID="579ef5b9a5b5f64a247a65f2b4df62bd207fbb5792687fcb87deba68c5ea1c97" exitCode=0 Oct 08 23:31:16 crc kubenswrapper[4834]: I1008 23:31:16.448003 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerDied","Data":"579ef5b9a5b5f64a247a65f2b4df62bd207fbb5792687fcb87deba68c5ea1c97"} Oct 08 23:31:17 crc kubenswrapper[4834]: I1008 23:31:17.025987 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:31:17 crc kubenswrapper[4834]: I1008 23:31:17.026078 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:31:17 crc kubenswrapper[4834]: I1008 23:31:17.461197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerStarted","Data":"d6bb54fbe6cf94cf9c65d7a8fb7b4664e1aca8416ce172127eaf5d13eb8048cb"} Oct 08 23:31:23 crc kubenswrapper[4834]: I1008 23:31:23.357590 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:23 crc kubenswrapper[4834]: I1008 23:31:23.358270 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:23 crc kubenswrapper[4834]: I1008 23:31:23.440392 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:23 crc kubenswrapper[4834]: I1008 23:31:23.465553 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bqbt4" podStartSLOduration=8.823729742 podStartE2EDuration="11.465532036s" podCreationTimestamp="2025-10-08 23:31:12 +0000 UTC" firstStartedPulling="2025-10-08 23:31:14.421791487 +0000 UTC m=+4082.244676233" lastFinishedPulling="2025-10-08 23:31:17.063593741 +0000 UTC m=+4084.886478527" observedRunningTime="2025-10-08 23:31:17.491315595 +0000 UTC m=+4085.314200411" watchObservedRunningTime="2025-10-08 23:31:23.465532036 +0000 UTC m=+4091.288416792" Oct 08 23:31:23 crc kubenswrapper[4834]: I1008 23:31:23.600909 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:23 crc kubenswrapper[4834]: I1008 23:31:23.684696 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bqbt4"] Oct 08 23:31:25 crc kubenswrapper[4834]: I1008 23:31:25.538221 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bqbt4" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="registry-server" containerID="cri-o://d6bb54fbe6cf94cf9c65d7a8fb7b4664e1aca8416ce172127eaf5d13eb8048cb" gracePeriod=2 Oct 08 23:31:26 crc kubenswrapper[4834]: I1008 23:31:26.555084 4834 generic.go:334] "Generic (PLEG): container finished" podID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerID="d6bb54fbe6cf94cf9c65d7a8fb7b4664e1aca8416ce172127eaf5d13eb8048cb" exitCode=0 Oct 08 23:31:26 crc kubenswrapper[4834]: I1008 23:31:26.555132 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerDied","Data":"d6bb54fbe6cf94cf9c65d7a8fb7b4664e1aca8416ce172127eaf5d13eb8048cb"} Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.299637 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.383476 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz9sm\" (UniqueName: \"kubernetes.io/projected/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-kube-api-access-qz9sm\") pod \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.383565 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-utilities\") pod \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.383685 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-catalog-content\") pod \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\" (UID: \"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e\") " Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.385062 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-utilities" (OuterVolumeSpecName: "utilities") pod "6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" (UID: "6325e5d0-5ff3-4891-a3e5-0aeeca41d93e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.391497 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-kube-api-access-qz9sm" (OuterVolumeSpecName: "kube-api-access-qz9sm") pod "6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" (UID: "6325e5d0-5ff3-4891-a3e5-0aeeca41d93e"). InnerVolumeSpecName "kube-api-access-qz9sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.439601 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" (UID: "6325e5d0-5ff3-4891-a3e5-0aeeca41d93e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.485031 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz9sm\" (UniqueName: \"kubernetes.io/projected/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-kube-api-access-qz9sm\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.485056 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.485067 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.567623 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.572613 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqbt4" event={"ID":"6325e5d0-5ff3-4891-a3e5-0aeeca41d93e","Type":"ContainerDied","Data":"4378afd1609a6fd3a572bd6fb5918753f153fbd389478b27d0b3744f775cdba6"} Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.572696 4834 scope.go:117] "RemoveContainer" containerID="d6bb54fbe6cf94cf9c65d7a8fb7b4664e1aca8416ce172127eaf5d13eb8048cb" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.608423 4834 scope.go:117] "RemoveContainer" containerID="579ef5b9a5b5f64a247a65f2b4df62bd207fbb5792687fcb87deba68c5ea1c97" Oct 08 23:31:27 crc kubenswrapper[4834]: I1008 23:31:27.644637 4834 scope.go:117] "RemoveContainer" containerID="79e8b6203e37b8f37ba3e5cc98dfca0cebf9bbf0321437579982a4e94fbb5d94" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.468949 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxgvd"] Oct 08 23:31:31 crc kubenswrapper[4834]: E1008 23:31:31.470285 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="extract-content" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.470314 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="extract-content" Oct 08 23:31:31 crc kubenswrapper[4834]: E1008 23:31:31.470352 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="registry-server" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.470369 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="registry-server" Oct 08 23:31:31 crc kubenswrapper[4834]: E1008 23:31:31.470421 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="extract-utilities" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.470438 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="extract-utilities" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.470730 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" containerName="registry-server" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.472943 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.484572 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxgvd"] Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.545132 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-utilities\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.545229 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-catalog-content\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.545339 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l6hz\" (UniqueName: \"kubernetes.io/projected/39ab5e53-5076-4635-8402-ca12dab572db-kube-api-access-2l6hz\") pod 
\"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.646880 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-catalog-content\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.647268 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l6hz\" (UniqueName: \"kubernetes.io/projected/39ab5e53-5076-4635-8402-ca12dab572db-kube-api-access-2l6hz\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.647396 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-utilities\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.647419 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-catalog-content\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.648261 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-utilities\") pod \"redhat-marketplace-sxgvd\" (UID: 
\"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.661207 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ntxws"] Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.662592 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.679867 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntxws"] Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.748235 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-utilities\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.748290 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-catalog-content\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.748374 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pph9v\" (UniqueName: \"kubernetes.io/projected/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-kube-api-access-pph9v\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.849595 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pph9v\" (UniqueName: \"kubernetes.io/projected/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-kube-api-access-pph9v\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.849721 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-utilities\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.849793 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-catalog-content\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.850268 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-utilities\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.850328 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-catalog-content\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.855779 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l6hz\" (UniqueName: 
\"kubernetes.io/projected/39ab5e53-5076-4635-8402-ca12dab572db-kube-api-access-2l6hz\") pod \"redhat-marketplace-sxgvd\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.867267 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pph9v\" (UniqueName: \"kubernetes.io/projected/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-kube-api-access-pph9v\") pod \"redhat-operators-ntxws\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:31 crc kubenswrapper[4834]: I1008 23:31:31.989938 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:32 crc kubenswrapper[4834]: I1008 23:31:32.131664 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:32 crc kubenswrapper[4834]: I1008 23:31:32.417036 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ntxws"] Oct 08 23:31:32 crc kubenswrapper[4834]: I1008 23:31:32.608903 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntxws" event={"ID":"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d","Type":"ContainerStarted","Data":"7c942c41687e8ff7e33028a9070c78771c8694d423ba5f4cbe3e5b7b68873086"} Oct 08 23:31:32 crc kubenswrapper[4834]: I1008 23:31:32.634908 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxgvd"] Oct 08 23:31:32 crc kubenswrapper[4834]: W1008 23:31:32.637601 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ab5e53_5076_4635_8402_ca12dab572db.slice/crio-5eacaec795d0adcae161f1d3d9fe338334ed350f58b7bc8e0272e83dfbf89f6a WatchSource:0}: Error 
finding container 5eacaec795d0adcae161f1d3d9fe338334ed350f58b7bc8e0272e83dfbf89f6a: Status 404 returned error can't find the container with id 5eacaec795d0adcae161f1d3d9fe338334ed350f58b7bc8e0272e83dfbf89f6a Oct 08 23:31:33 crc kubenswrapper[4834]: I1008 23:31:33.624910 4834 generic.go:334] "Generic (PLEG): container finished" podID="39ab5e53-5076-4635-8402-ca12dab572db" containerID="54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced" exitCode=0 Oct 08 23:31:33 crc kubenswrapper[4834]: I1008 23:31:33.625029 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerDied","Data":"54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced"} Oct 08 23:31:33 crc kubenswrapper[4834]: I1008 23:31:33.625526 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerStarted","Data":"5eacaec795d0adcae161f1d3d9fe338334ed350f58b7bc8e0272e83dfbf89f6a"} Oct 08 23:31:33 crc kubenswrapper[4834]: I1008 23:31:33.629333 4834 generic.go:334] "Generic (PLEG): container finished" podID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerID="d3c04dae36584d477693bc5ed47d929bb89eba62b45dbf90cc4243709f6b8da5" exitCode=0 Oct 08 23:31:33 crc kubenswrapper[4834]: I1008 23:31:33.629393 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntxws" event={"ID":"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d","Type":"ContainerDied","Data":"d3c04dae36584d477693bc5ed47d929bb89eba62b45dbf90cc4243709f6b8da5"} Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.089564 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l9xqk"] Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.094114 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.120523 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9xqk"] Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.293278 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clt4x\" (UniqueName: \"kubernetes.io/projected/452b6539-a6a4-427d-958a-1c0f4790793d-kube-api-access-clt4x\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.293440 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-utilities\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.293463 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-catalog-content\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.395456 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clt4x\" (UniqueName: \"kubernetes.io/projected/452b6539-a6a4-427d-958a-1c0f4790793d-kube-api-access-clt4x\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.395589 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-utilities\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.395635 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-catalog-content\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.396114 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-utilities\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.396268 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-catalog-content\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.422115 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clt4x\" (UniqueName: \"kubernetes.io/projected/452b6539-a6a4-427d-958a-1c0f4790793d-kube-api-access-clt4x\") pod \"community-operators-l9xqk\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.441830 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.637606 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerStarted","Data":"671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee"} Oct 08 23:31:34 crc kubenswrapper[4834]: W1008 23:31:34.801113 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452b6539_a6a4_427d_958a_1c0f4790793d.slice/crio-42b984d1792183ee7b360749e7bde8606d5bb7189e1cf44708e7ac7d64e5047d WatchSource:0}: Error finding container 42b984d1792183ee7b360749e7bde8606d5bb7189e1cf44708e7ac7d64e5047d: Status 404 returned error can't find the container with id 42b984d1792183ee7b360749e7bde8606d5bb7189e1cf44708e7ac7d64e5047d Oct 08 23:31:34 crc kubenswrapper[4834]: I1008 23:31:34.803753 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9xqk"] Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 23:31:35.650345 4834 generic.go:334] "Generic (PLEG): container finished" podID="39ab5e53-5076-4635-8402-ca12dab572db" containerID="671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee" exitCode=0 Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 23:31:35.650475 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerDied","Data":"671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee"} Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 23:31:35.655188 4834 generic.go:334] "Generic (PLEG): container finished" podID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerID="63bc8f8b70ec0cf9b1910fb6d620951127e4d573ef96df4e976bedcdd648482e" exitCode=0 Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 
23:31:35.655282 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntxws" event={"ID":"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d","Type":"ContainerDied","Data":"63bc8f8b70ec0cf9b1910fb6d620951127e4d573ef96df4e976bedcdd648482e"} Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 23:31:35.659185 4834 generic.go:334] "Generic (PLEG): container finished" podID="452b6539-a6a4-427d-958a-1c0f4790793d" containerID="9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356" exitCode=0 Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 23:31:35.659256 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9xqk" event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerDied","Data":"9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356"} Oct 08 23:31:35 crc kubenswrapper[4834]: I1008 23:31:35.659294 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9xqk" event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerStarted","Data":"42b984d1792183ee7b360749e7bde8606d5bb7189e1cf44708e7ac7d64e5047d"} Oct 08 23:31:36 crc kubenswrapper[4834]: I1008 23:31:36.674528 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerStarted","Data":"7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a"} Oct 08 23:31:36 crc kubenswrapper[4834]: I1008 23:31:36.681612 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntxws" event={"ID":"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d","Type":"ContainerStarted","Data":"aec053f2227e44f143b0ce018302e0ec79f314895042e568e89d11373dd20209"} Oct 08 23:31:36 crc kubenswrapper[4834]: I1008 23:31:36.684261 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9xqk" 
event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerStarted","Data":"78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd"} Oct 08 23:31:36 crc kubenswrapper[4834]: I1008 23:31:36.706601 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxgvd" podStartSLOduration=3.283877494 podStartE2EDuration="5.706581769s" podCreationTimestamp="2025-10-08 23:31:31 +0000 UTC" firstStartedPulling="2025-10-08 23:31:33.628725067 +0000 UTC m=+4101.451609843" lastFinishedPulling="2025-10-08 23:31:36.051429332 +0000 UTC m=+4103.874314118" observedRunningTime="2025-10-08 23:31:36.703655097 +0000 UTC m=+4104.526539853" watchObservedRunningTime="2025-10-08 23:31:36.706581769 +0000 UTC m=+4104.529466525" Oct 08 23:31:36 crc kubenswrapper[4834]: I1008 23:31:36.729281 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ntxws" podStartSLOduration=3.197011792 podStartE2EDuration="5.729258022s" podCreationTimestamp="2025-10-08 23:31:31 +0000 UTC" firstStartedPulling="2025-10-08 23:31:33.633803791 +0000 UTC m=+4101.456688547" lastFinishedPulling="2025-10-08 23:31:36.166050001 +0000 UTC m=+4103.988934777" observedRunningTime="2025-10-08 23:31:36.722966268 +0000 UTC m=+4104.545851054" watchObservedRunningTime="2025-10-08 23:31:36.729258022 +0000 UTC m=+4104.552142778" Oct 08 23:31:37 crc kubenswrapper[4834]: I1008 23:31:37.698805 4834 generic.go:334] "Generic (PLEG): container finished" podID="452b6539-a6a4-427d-958a-1c0f4790793d" containerID="78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd" exitCode=0 Oct 08 23:31:37 crc kubenswrapper[4834]: I1008 23:31:37.699000 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9xqk" event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerDied","Data":"78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd"} Oct 
08 23:31:37 crc kubenswrapper[4834]: I1008 23:31:37.699071 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9xqk" event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerStarted","Data":"afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef"} Oct 08 23:31:37 crc kubenswrapper[4834]: I1008 23:31:37.733074 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l9xqk" podStartSLOduration=2.246836392 podStartE2EDuration="3.733044781s" podCreationTimestamp="2025-10-08 23:31:34 +0000 UTC" firstStartedPulling="2025-10-08 23:31:35.66146768 +0000 UTC m=+4103.484352456" lastFinishedPulling="2025-10-08 23:31:37.147676089 +0000 UTC m=+4104.970560845" observedRunningTime="2025-10-08 23:31:37.724972974 +0000 UTC m=+4105.547857750" watchObservedRunningTime="2025-10-08 23:31:37.733044781 +0000 UTC m=+4105.555929567" Oct 08 23:31:41 crc kubenswrapper[4834]: I1008 23:31:41.990280 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:41 crc kubenswrapper[4834]: I1008 23:31:41.990955 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:42 crc kubenswrapper[4834]: I1008 23:31:42.131878 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:42 crc kubenswrapper[4834]: I1008 23:31:42.131967 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:42 crc kubenswrapper[4834]: I1008 23:31:42.178292 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:43 crc kubenswrapper[4834]: I1008 23:31:43.053423 4834 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-ntxws" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="registry-server" probeResult="failure" output=< Oct 08 23:31:43 crc kubenswrapper[4834]: timeout: failed to connect service ":50051" within 1s Oct 08 23:31:43 crc kubenswrapper[4834]: > Oct 08 23:31:43 crc kubenswrapper[4834]: I1008 23:31:43.298058 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:44 crc kubenswrapper[4834]: I1008 23:31:44.442625 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:44 crc kubenswrapper[4834]: I1008 23:31:44.442684 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:44 crc kubenswrapper[4834]: I1008 23:31:44.458043 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxgvd"] Oct 08 23:31:44 crc kubenswrapper[4834]: I1008 23:31:44.531417 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:44 crc kubenswrapper[4834]: I1008 23:31:44.822929 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxgvd" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="registry-server" containerID="cri-o://7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a" gracePeriod=2 Oct 08 23:31:44 crc kubenswrapper[4834]: I1008 23:31:44.881004 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.520047 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.673384 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-catalog-content\") pod \"39ab5e53-5076-4635-8402-ca12dab572db\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.673555 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-utilities\") pod \"39ab5e53-5076-4635-8402-ca12dab572db\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.673657 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l6hz\" (UniqueName: \"kubernetes.io/projected/39ab5e53-5076-4635-8402-ca12dab572db-kube-api-access-2l6hz\") pod \"39ab5e53-5076-4635-8402-ca12dab572db\" (UID: \"39ab5e53-5076-4635-8402-ca12dab572db\") " Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.675398 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-utilities" (OuterVolumeSpecName: "utilities") pod "39ab5e53-5076-4635-8402-ca12dab572db" (UID: "39ab5e53-5076-4635-8402-ca12dab572db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.682480 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ab5e53-5076-4635-8402-ca12dab572db-kube-api-access-2l6hz" (OuterVolumeSpecName: "kube-api-access-2l6hz") pod "39ab5e53-5076-4635-8402-ca12dab572db" (UID: "39ab5e53-5076-4635-8402-ca12dab572db"). InnerVolumeSpecName "kube-api-access-2l6hz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.701393 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39ab5e53-5076-4635-8402-ca12dab572db" (UID: "39ab5e53-5076-4635-8402-ca12dab572db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.775655 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.775685 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ab5e53-5076-4635-8402-ca12dab572db-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.775697 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l6hz\" (UniqueName: \"kubernetes.io/projected/39ab5e53-5076-4635-8402-ca12dab572db-kube-api-access-2l6hz\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.835188 4834 generic.go:334] "Generic (PLEG): container finished" podID="39ab5e53-5076-4635-8402-ca12dab572db" containerID="7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a" exitCode=0 Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.835248 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxgvd" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.835286 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerDied","Data":"7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a"} Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.835406 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxgvd" event={"ID":"39ab5e53-5076-4635-8402-ca12dab572db","Type":"ContainerDied","Data":"5eacaec795d0adcae161f1d3d9fe338334ed350f58b7bc8e0272e83dfbf89f6a"} Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.835465 4834 scope.go:117] "RemoveContainer" containerID="7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.875355 4834 scope.go:117] "RemoveContainer" containerID="671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.880164 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxgvd"] Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.884598 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxgvd"] Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.905319 4834 scope.go:117] "RemoveContainer" containerID="54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.925947 4834 scope.go:117] "RemoveContainer" containerID="7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a" Oct 08 23:31:45 crc kubenswrapper[4834]: E1008 23:31:45.926791 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a\": container with ID starting with 7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a not found: ID does not exist" containerID="7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.926841 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a"} err="failed to get container status \"7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a\": rpc error: code = NotFound desc = could not find container \"7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a\": container with ID starting with 7cdc7312962b171693ffb4d3a993159c167afe92eb8303d68f5dfe942c4ad30a not found: ID does not exist" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.926872 4834 scope.go:117] "RemoveContainer" containerID="671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee" Oct 08 23:31:45 crc kubenswrapper[4834]: E1008 23:31:45.927361 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee\": container with ID starting with 671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee not found: ID does not exist" containerID="671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.927392 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee"} err="failed to get container status \"671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee\": rpc error: code = NotFound desc = could not find container \"671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee\": container with ID 
starting with 671b842469cabcbe5fe84e424b61ecfc09deb21e7f85b7b613cf3d6833fc97ee not found: ID does not exist" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.927410 4834 scope.go:117] "RemoveContainer" containerID="54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced" Oct 08 23:31:45 crc kubenswrapper[4834]: E1008 23:31:45.927816 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced\": container with ID starting with 54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced not found: ID does not exist" containerID="54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced" Oct 08 23:31:45 crc kubenswrapper[4834]: I1008 23:31:45.927851 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced"} err="failed to get container status \"54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced\": rpc error: code = NotFound desc = could not find container \"54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced\": container with ID starting with 54fb298628d273e9bacedec7733b984b1ca0505c3c50ffd42836d361d4974ced not found: ID does not exist" Oct 08 23:31:46 crc kubenswrapper[4834]: I1008 23:31:46.853356 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9xqk"] Oct 08 23:31:46 crc kubenswrapper[4834]: I1008 23:31:46.854548 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l9xqk" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="registry-server" containerID="cri-o://afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef" gracePeriod=2 Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.027157 4834 patch_prober.go:28] interesting 
pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.027308 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.393805 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.506012 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-catalog-content\") pod \"452b6539-a6a4-427d-958a-1c0f4790793d\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.506076 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clt4x\" (UniqueName: \"kubernetes.io/projected/452b6539-a6a4-427d-958a-1c0f4790793d-kube-api-access-clt4x\") pod \"452b6539-a6a4-427d-958a-1c0f4790793d\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.506122 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-utilities\") pod \"452b6539-a6a4-427d-958a-1c0f4790793d\" (UID: \"452b6539-a6a4-427d-958a-1c0f4790793d\") " Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.507323 4834 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-utilities" (OuterVolumeSpecName: "utilities") pod "452b6539-a6a4-427d-958a-1c0f4790793d" (UID: "452b6539-a6a4-427d-958a-1c0f4790793d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.515205 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452b6539-a6a4-427d-958a-1c0f4790793d-kube-api-access-clt4x" (OuterVolumeSpecName: "kube-api-access-clt4x") pod "452b6539-a6a4-427d-958a-1c0f4790793d" (UID: "452b6539-a6a4-427d-958a-1c0f4790793d"). InnerVolumeSpecName "kube-api-access-clt4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.574811 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ab5e53-5076-4635-8402-ca12dab572db" path="/var/lib/kubelet/pods/39ab5e53-5076-4635-8402-ca12dab572db/volumes" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.607880 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clt4x\" (UniqueName: \"kubernetes.io/projected/452b6539-a6a4-427d-958a-1c0f4790793d-kube-api-access-clt4x\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.607926 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.867292 4834 generic.go:334] "Generic (PLEG): container finished" podID="452b6539-a6a4-427d-958a-1c0f4790793d" containerID="afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef" exitCode=0 Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.867361 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-l9xqk" event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerDied","Data":"afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef"} Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.867418 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9xqk" event={"ID":"452b6539-a6a4-427d-958a-1c0f4790793d","Type":"ContainerDied","Data":"42b984d1792183ee7b360749e7bde8606d5bb7189e1cf44708e7ac7d64e5047d"} Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.867453 4834 scope.go:117] "RemoveContainer" containerID="afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.867466 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9xqk" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.901306 4834 scope.go:117] "RemoveContainer" containerID="78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.903923 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "452b6539-a6a4-427d-958a-1c0f4790793d" (UID: "452b6539-a6a4-427d-958a-1c0f4790793d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.913820 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452b6539-a6a4-427d-958a-1c0f4790793d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.927373 4834 scope.go:117] "RemoveContainer" containerID="9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.984725 4834 scope.go:117] "RemoveContainer" containerID="afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef" Oct 08 23:31:47 crc kubenswrapper[4834]: E1008 23:31:47.985670 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef\": container with ID starting with afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef not found: ID does not exist" containerID="afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.985896 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef"} err="failed to get container status \"afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef\": rpc error: code = NotFound desc = could not find container \"afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef\": container with ID starting with afecc65e916eadbcb9806f0d2ca0dfa0f343018bbc56fc50e851905277a1ceef not found: ID does not exist" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.986112 4834 scope.go:117] "RemoveContainer" containerID="78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd" Oct 08 23:31:47 crc kubenswrapper[4834]: E1008 23:31:47.987015 4834 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd\": container with ID starting with 78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd not found: ID does not exist" containerID="78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.987060 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd"} err="failed to get container status \"78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd\": rpc error: code = NotFound desc = could not find container \"78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd\": container with ID starting with 78f05443bfa30e00d43e7be405de7a83d63d4657b8833f3557b332bb15e52cfd not found: ID does not exist" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.987089 4834 scope.go:117] "RemoveContainer" containerID="9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356" Oct 08 23:31:47 crc kubenswrapper[4834]: E1008 23:31:47.987938 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356\": container with ID starting with 9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356 not found: ID does not exist" containerID="9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356" Oct 08 23:31:47 crc kubenswrapper[4834]: I1008 23:31:47.988002 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356"} err="failed to get container status \"9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356\": rpc error: code = NotFound desc = could 
not find container \"9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356\": container with ID starting with 9f79ef184bf52b61ac94c13939dfd076cbda70a36b04f67d6b57e4e58bd6d356 not found: ID does not exist" Oct 08 23:31:48 crc kubenswrapper[4834]: I1008 23:31:48.228651 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9xqk"] Oct 08 23:31:48 crc kubenswrapper[4834]: I1008 23:31:48.238272 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l9xqk"] Oct 08 23:31:49 crc kubenswrapper[4834]: I1008 23:31:49.570414 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" path="/var/lib/kubelet/pods/452b6539-a6a4-427d-958a-1c0f4790793d/volumes" Oct 08 23:31:52 crc kubenswrapper[4834]: I1008 23:31:52.068581 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:52 crc kubenswrapper[4834]: I1008 23:31:52.158775 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:52 crc kubenswrapper[4834]: I1008 23:31:52.317391 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntxws"] Oct 08 23:31:53 crc kubenswrapper[4834]: I1008 23:31:53.932056 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ntxws" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="registry-server" containerID="cri-o://aec053f2227e44f143b0ce018302e0ec79f314895042e568e89d11373dd20209" gracePeriod=2 Oct 08 23:31:54 crc kubenswrapper[4834]: I1008 23:31:54.944973 4834 generic.go:334] "Generic (PLEG): container finished" podID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerID="aec053f2227e44f143b0ce018302e0ec79f314895042e568e89d11373dd20209" exitCode=0 Oct 08 23:31:54 
crc kubenswrapper[4834]: I1008 23:31:54.945054 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntxws" event={"ID":"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d","Type":"ContainerDied","Data":"aec053f2227e44f143b0ce018302e0ec79f314895042e568e89d11373dd20209"} Oct 08 23:31:54 crc kubenswrapper[4834]: I1008 23:31:54.945428 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ntxws" event={"ID":"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d","Type":"ContainerDied","Data":"7c942c41687e8ff7e33028a9070c78771c8694d423ba5f4cbe3e5b7b68873086"} Oct 08 23:31:54 crc kubenswrapper[4834]: I1008 23:31:54.945449 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c942c41687e8ff7e33028a9070c78771c8694d423ba5f4cbe3e5b7b68873086" Oct 08 23:31:54 crc kubenswrapper[4834]: I1008 23:31:54.978687 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.140478 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-utilities\") pod \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.140536 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-catalog-content\") pod \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.140628 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pph9v\" (UniqueName: 
\"kubernetes.io/projected/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-kube-api-access-pph9v\") pod \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\" (UID: \"7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d\") " Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.142122 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-utilities" (OuterVolumeSpecName: "utilities") pod "7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" (UID: "7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.146069 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-kube-api-access-pph9v" (OuterVolumeSpecName: "kube-api-access-pph9v") pod "7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" (UID: "7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d"). InnerVolumeSpecName "kube-api-access-pph9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.242649 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.242693 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pph9v\" (UniqueName: \"kubernetes.io/projected/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-kube-api-access-pph9v\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.272354 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" (UID: "7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.343705 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.954920 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ntxws" Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.986216 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ntxws"] Oct 08 23:31:55 crc kubenswrapper[4834]: I1008 23:31:55.996578 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ntxws"] Oct 08 23:31:57 crc kubenswrapper[4834]: I1008 23:31:57.570209 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" path="/var/lib/kubelet/pods/7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d/volumes" Oct 08 23:31:57 crc kubenswrapper[4834]: I1008 23:31:57.586031 4834 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pod6325e5d0-5ff3-4891-a3e5-0aeeca41d93e"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pod6325e5d0-5ff3-4891-a3e5-0aeeca41d93e] : Timed out while waiting for systemd to remove kubepods-burstable-pod6325e5d0_5ff3_4891_a3e5_0aeeca41d93e.slice" Oct 08 23:31:57 crc kubenswrapper[4834]: E1008 23:31:57.586334 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable pod6325e5d0-5ff3-4891-a3e5-0aeeca41d93e] : unable to destroy cgroup paths for cgroup [kubepods burstable pod6325e5d0-5ff3-4891-a3e5-0aeeca41d93e] : Timed out while waiting for systemd to remove 
kubepods-burstable-pod6325e5d0_5ff3_4891_a3e5_0aeeca41d93e.slice" pod="openshift-marketplace/certified-operators-bqbt4" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" Oct 08 23:31:57 crc kubenswrapper[4834]: I1008 23:31:57.975318 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqbt4" Oct 08 23:31:58 crc kubenswrapper[4834]: I1008 23:31:58.000442 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bqbt4"] Oct 08 23:31:58 crc kubenswrapper[4834]: I1008 23:31:58.013610 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bqbt4"] Oct 08 23:31:59 crc kubenswrapper[4834]: I1008 23:31:59.572634 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6325e5d0-5ff3-4891-a3e5-0aeeca41d93e" path="/var/lib/kubelet/pods/6325e5d0-5ff3-4891-a3e5-0aeeca41d93e/volumes" Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.025334 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.026098 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.026219 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.027181 4834 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5e9e52ef77663387f3ef38b160c0950f459925f0fa3b6dc9084e3092d0e0a5c"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.027288 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://a5e9e52ef77663387f3ef38b160c0950f459925f0fa3b6dc9084e3092d0e0a5c" gracePeriod=600 Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.171292 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="a5e9e52ef77663387f3ef38b160c0950f459925f0fa3b6dc9084e3092d0e0a5c" exitCode=0 Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.171355 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"a5e9e52ef77663387f3ef38b160c0950f459925f0fa3b6dc9084e3092d0e0a5c"} Oct 08 23:32:17 crc kubenswrapper[4834]: I1008 23:32:17.171411 4834 scope.go:117] "RemoveContainer" containerID="6d7d742f232f962e6e40df6e9085a73321a30d2c6a19a435f9a98cf75d0bcd4a" Oct 08 23:32:18 crc kubenswrapper[4834]: I1008 23:32:18.183319 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"} Oct 08 23:34:17 crc kubenswrapper[4834]: I1008 23:34:17.025534 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:34:17 crc kubenswrapper[4834]: I1008 23:34:17.026207 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:34:47 crc kubenswrapper[4834]: I1008 23:34:47.025265 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:34:47 crc kubenswrapper[4834]: I1008 23:34:47.026426 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.026104 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.027043 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.027133 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.028227 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.028356 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" gracePeriod=600 Oct 08 23:35:17 crc kubenswrapper[4834]: E1008 23:35:17.165270 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.988387 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" exitCode=0 Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.988476 4834 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"} Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.988591 4834 scope.go:117] "RemoveContainer" containerID="a5e9e52ef77663387f3ef38b160c0950f459925f0fa3b6dc9084e3092d0e0a5c" Oct 08 23:35:17 crc kubenswrapper[4834]: I1008 23:35:17.989264 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:35:17 crc kubenswrapper[4834]: E1008 23:35:17.989823 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:35:33 crc kubenswrapper[4834]: I1008 23:35:33.564118 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:35:33 crc kubenswrapper[4834]: E1008 23:35:33.565243 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:35:45 crc kubenswrapper[4834]: I1008 23:35:45.556256 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:35:45 crc kubenswrapper[4834]: E1008 
23:35:45.557328 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:35:58 crc kubenswrapper[4834]: I1008 23:35:58.555610 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:35:58 crc kubenswrapper[4834]: E1008 23:35:58.556313 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:36:11 crc kubenswrapper[4834]: I1008 23:36:11.555846 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:36:11 crc kubenswrapper[4834]: E1008 23:36:11.556717 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:36:26 crc kubenswrapper[4834]: I1008 23:36:26.557102 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:36:26 crc 
kubenswrapper[4834]: E1008 23:36:26.558579 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:36:41 crc kubenswrapper[4834]: I1008 23:36:41.556786 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:36:41 crc kubenswrapper[4834]: E1008 23:36:41.557766 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:36:55 crc kubenswrapper[4834]: I1008 23:36:55.555930 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:36:55 crc kubenswrapper[4834]: E1008 23:36:55.556880 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:37:10 crc kubenswrapper[4834]: I1008 23:37:10.556192 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 
08 23:37:10 crc kubenswrapper[4834]: E1008 23:37:10.557369 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:37:21 crc kubenswrapper[4834]: I1008 23:37:21.556333 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:37:21 crc kubenswrapper[4834]: E1008 23:37:21.557790 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:37:36 crc kubenswrapper[4834]: I1008 23:37:36.555933 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:37:36 crc kubenswrapper[4834]: E1008 23:37:36.556687 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:37:50 crc kubenswrapper[4834]: I1008 23:37:50.555088 4834 scope.go:117] "RemoveContainer" 
containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:37:50 crc kubenswrapper[4834]: E1008 23:37:50.555961 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:38:01 crc kubenswrapper[4834]: I1008 23:38:01.556024 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:38:01 crc kubenswrapper[4834]: E1008 23:38:01.557265 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:38:15 crc kubenswrapper[4834]: I1008 23:38:15.555996 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:38:15 crc kubenswrapper[4834]: E1008 23:38:15.557779 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:38:23 crc kubenswrapper[4834]: I1008 23:38:23.165447 4834 scope.go:117] 
"RemoveContainer" containerID="63bc8f8b70ec0cf9b1910fb6d620951127e4d573ef96df4e976bedcdd648482e" Oct 08 23:38:23 crc kubenswrapper[4834]: I1008 23:38:23.208017 4834 scope.go:117] "RemoveContainer" containerID="d3c04dae36584d477693bc5ed47d929bb89eba62b45dbf90cc4243709f6b8da5" Oct 08 23:38:23 crc kubenswrapper[4834]: I1008 23:38:23.234604 4834 scope.go:117] "RemoveContainer" containerID="aec053f2227e44f143b0ce018302e0ec79f314895042e568e89d11373dd20209" Oct 08 23:38:29 crc kubenswrapper[4834]: I1008 23:38:29.555625 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:38:29 crc kubenswrapper[4834]: E1008 23:38:29.556364 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:38:42 crc kubenswrapper[4834]: I1008 23:38:42.556033 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:38:42 crc kubenswrapper[4834]: E1008 23:38:42.557064 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:38:57 crc kubenswrapper[4834]: I1008 23:38:57.565529 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:38:57 crc 
kubenswrapper[4834]: E1008 23:38:57.566649 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:39:08 crc kubenswrapper[4834]: I1008 23:39:08.556283 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:39:08 crc kubenswrapper[4834]: E1008 23:39:08.557770 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:39:20 crc kubenswrapper[4834]: I1008 23:39:20.555721 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:39:20 crc kubenswrapper[4834]: E1008 23:39:20.556628 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:39:32 crc kubenswrapper[4834]: I1008 23:39:32.555949 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 
08 23:39:32 crc kubenswrapper[4834]: E1008 23:39:32.556748 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:39:46 crc kubenswrapper[4834]: I1008 23:39:46.556355 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"
Oct 08 23:39:46 crc kubenswrapper[4834]: E1008 23:39:46.557611 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:39:58 crc kubenswrapper[4834]: I1008 23:39:58.556296 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"
Oct 08 23:39:58 crc kubenswrapper[4834]: E1008 23:39:58.557539 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:40:11 crc kubenswrapper[4834]: I1008 23:40:11.556423 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"
Oct 08 23:40:11 crc kubenswrapper[4834]: E1008 23:40:11.557493 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"
Oct 08 23:40:24 crc kubenswrapper[4834]: I1008 23:40:24.555540 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35"
Oct 08 23:40:24 crc kubenswrapper[4834]: I1008 23:40:24.922025 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"276ba5956ea31e01e4d465b12ffad880ddc21751d64b35842ae87952c0e6e1f3"}
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.334475 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c856n"]
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335598 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="extract-utilities"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335622 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="extract-utilities"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335642 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="extract-content"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335654 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="extract-content"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335692 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335705 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335720 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="extract-content"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335734 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="extract-content"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335753 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335765 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335783 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335796 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335811 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="extract-utilities"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335823 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="extract-utilities"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.335850 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="extract-content"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.335863 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="extract-content"
Oct 08 23:41:36 crc kubenswrapper[4834]: E1008 23:41:36.336177 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="extract-utilities"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.336197 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="extract-utilities"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.336445 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="452b6539-a6a4-427d-958a-1c0f4790793d" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.336494 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ab5e53-5076-4635-8402-ca12dab572db" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.336514 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da01afb-5cf9-4b2f-9d29-3f6ae7435e2d" containerName="registry-server"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.339700 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.346619 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jt5b\" (UniqueName: \"kubernetes.io/projected/dd3924e3-9d33-45b4-ace8-36eb0aa33199-kube-api-access-8jt5b\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.346689 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-utilities\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.346723 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-catalog-content\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.351175 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c856n"]
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.447834 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jt5b\" (UniqueName: \"kubernetes.io/projected/dd3924e3-9d33-45b4-ace8-36eb0aa33199-kube-api-access-8jt5b\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.447923 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-utilities\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.447947 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-catalog-content\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.448843 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-utilities\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.448942 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-catalog-content\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.472857 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jt5b\" (UniqueName: \"kubernetes.io/projected/dd3924e3-9d33-45b4-ace8-36eb0aa33199-kube-api-access-8jt5b\") pod \"redhat-marketplace-c856n\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") " pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:36 crc kubenswrapper[4834]: I1008 23:41:36.703830 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:37 crc kubenswrapper[4834]: I1008 23:41:37.168905 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c856n"]
Oct 08 23:41:37 crc kubenswrapper[4834]: W1008 23:41:37.184117 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3924e3_9d33_45b4_ace8_36eb0aa33199.slice/crio-6547ca9728b5d3757994e9fd439f3080d7612b7f0a328345129d8076096872d8 WatchSource:0}: Error finding container 6547ca9728b5d3757994e9fd439f3080d7612b7f0a328345129d8076096872d8: Status 404 returned error can't find the container with id 6547ca9728b5d3757994e9fd439f3080d7612b7f0a328345129d8076096872d8
Oct 08 23:41:37 crc kubenswrapper[4834]: I1008 23:41:37.621015 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerID="949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557" exitCode=0
Oct 08 23:41:37 crc kubenswrapper[4834]: I1008 23:41:37.621081 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c856n" event={"ID":"dd3924e3-9d33-45b4-ace8-36eb0aa33199","Type":"ContainerDied","Data":"949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557"}
Oct 08 23:41:37 crc kubenswrapper[4834]: I1008 23:41:37.621120 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c856n" event={"ID":"dd3924e3-9d33-45b4-ace8-36eb0aa33199","Type":"ContainerStarted","Data":"6547ca9728b5d3757994e9fd439f3080d7612b7f0a328345129d8076096872d8"}
Oct 08 23:41:37 crc kubenswrapper[4834]: I1008 23:41:37.624138 4834 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 08 23:41:39 crc kubenswrapper[4834]: I1008 23:41:39.654758 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerID="a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a" exitCode=0
Oct 08 23:41:39 crc kubenswrapper[4834]: I1008 23:41:39.654843 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c856n" event={"ID":"dd3924e3-9d33-45b4-ace8-36eb0aa33199","Type":"ContainerDied","Data":"a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a"}
Oct 08 23:41:40 crc kubenswrapper[4834]: I1008 23:41:40.670556 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c856n" event={"ID":"dd3924e3-9d33-45b4-ace8-36eb0aa33199","Type":"ContainerStarted","Data":"b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e"}
Oct 08 23:41:40 crc kubenswrapper[4834]: I1008 23:41:40.703380 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c856n" podStartSLOduration=2.221559527 podStartE2EDuration="4.703348293s" podCreationTimestamp="2025-10-08 23:41:36 +0000 UTC" firstStartedPulling="2025-10-08 23:41:37.623652276 +0000 UTC m=+4705.446537052" lastFinishedPulling="2025-10-08 23:41:40.105441032 +0000 UTC m=+4707.928325818" observedRunningTime="2025-10-08 23:41:40.69759281 +0000 UTC m=+4708.520477596" watchObservedRunningTime="2025-10-08 23:41:40.703348293 +0000 UTC m=+4708.526233099"
Oct 08 23:41:46 crc kubenswrapper[4834]: I1008 23:41:46.704849 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:46 crc kubenswrapper[4834]: I1008 23:41:46.705589 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:46 crc kubenswrapper[4834]: I1008 23:41:46.783032 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:46 crc kubenswrapper[4834]: I1008 23:41:46.868053 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:47 crc kubenswrapper[4834]: I1008 23:41:47.037022 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c856n"]
Oct 08 23:41:48 crc kubenswrapper[4834]: I1008 23:41:48.755039 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c856n" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="registry-server" containerID="cri-o://b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e" gracePeriod=2
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.321020 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.473757 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jt5b\" (UniqueName: \"kubernetes.io/projected/dd3924e3-9d33-45b4-ace8-36eb0aa33199-kube-api-access-8jt5b\") pod \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") "
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.473854 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-utilities\") pod \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") "
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.473932 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-catalog-content\") pod \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\" (UID: \"dd3924e3-9d33-45b4-ace8-36eb0aa33199\") "
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.475474 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-utilities" (OuterVolumeSpecName: "utilities") pod "dd3924e3-9d33-45b4-ace8-36eb0aa33199" (UID: "dd3924e3-9d33-45b4-ace8-36eb0aa33199"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.486357 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3924e3-9d33-45b4-ace8-36eb0aa33199-kube-api-access-8jt5b" (OuterVolumeSpecName: "kube-api-access-8jt5b") pod "dd3924e3-9d33-45b4-ace8-36eb0aa33199" (UID: "dd3924e3-9d33-45b4-ace8-36eb0aa33199"). InnerVolumeSpecName "kube-api-access-8jt5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.489444 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd3924e3-9d33-45b4-ace8-36eb0aa33199" (UID: "dd3924e3-9d33-45b4-ace8-36eb0aa33199"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.575845 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jt5b\" (UniqueName: \"kubernetes.io/projected/dd3924e3-9d33-45b4-ace8-36eb0aa33199-kube-api-access-8jt5b\") on node \"crc\" DevicePath \"\""
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.575917 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.575944 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd3924e3-9d33-45b4-ace8-36eb0aa33199-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.766933 4834 generic.go:334] "Generic (PLEG): container finished" podID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerID="b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e" exitCode=0
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.766984 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c856n" event={"ID":"dd3924e3-9d33-45b4-ace8-36eb0aa33199","Type":"ContainerDied","Data":"b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e"}
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.766994 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c856n"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.767016 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c856n" event={"ID":"dd3924e3-9d33-45b4-ace8-36eb0aa33199","Type":"ContainerDied","Data":"6547ca9728b5d3757994e9fd439f3080d7612b7f0a328345129d8076096872d8"}
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.767040 4834 scope.go:117] "RemoveContainer" containerID="b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.803663 4834 scope.go:117] "RemoveContainer" containerID="a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.812743 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c856n"]
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.817770 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c856n"]
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.843735 4834 scope.go:117] "RemoveContainer" containerID="949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.873200 4834 scope.go:117] "RemoveContainer" containerID="b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e"
Oct 08 23:41:49 crc kubenswrapper[4834]: E1008 23:41:49.873644 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e\": container with ID starting with b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e not found: ID does not exist" containerID="b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.873682 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e"} err="failed to get container status \"b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e\": rpc error: code = NotFound desc = could not find container \"b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e\": container with ID starting with b82a8d428aedbb52b5354fe388ac31ed2b928fe1e0a15f4a36fd4ead3795bf8e not found: ID does not exist"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.873703 4834 scope.go:117] "RemoveContainer" containerID="a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a"
Oct 08 23:41:49 crc kubenswrapper[4834]: E1008 23:41:49.873945 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a\": container with ID starting with a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a not found: ID does not exist" containerID="a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.873970 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a"} err="failed to get container status \"a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a\": rpc error: code = NotFound desc = could not find container \"a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a\": container with ID starting with a63110e378f95cbb345a1d8c2ec54c4d7910df95c1d72a224c2c73df37c4224a not found: ID does not exist"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.873988 4834 scope.go:117] "RemoveContainer" containerID="949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557"
Oct 08 23:41:49 crc kubenswrapper[4834]: E1008 23:41:49.874283 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557\": container with ID starting with 949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557 not found: ID does not exist" containerID="949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557"
Oct 08 23:41:49 crc kubenswrapper[4834]: I1008 23:41:49.874301 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557"} err="failed to get container status \"949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557\": rpc error: code = NotFound desc = could not find container \"949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557\": container with ID starting with 949601708f5f6a6eb0bae38a8fa0a96bd1090b1dc446ed2a87ddc6c151b1d557 not found: ID does not exist"
Oct 08 23:41:51 crc kubenswrapper[4834]: I1008 23:41:51.572214 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" path="/var/lib/kubelet/pods/dd3924e3-9d33-45b4-ace8-36eb0aa33199/volumes"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.255548 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xtptz"]
Oct 08 23:42:06 crc kubenswrapper[4834]: E1008 23:42:06.256349 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="registry-server"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.256361 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="registry-server"
Oct 08 23:42:06 crc kubenswrapper[4834]: E1008 23:42:06.256378 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="extract-content"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.256386 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="extract-content"
Oct 08 23:42:06 crc kubenswrapper[4834]: E1008 23:42:06.256399 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="extract-utilities"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.256407 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="extract-utilities"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.256570 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3924e3-9d33-45b4-ace8-36eb0aa33199" containerName="registry-server"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.257707 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.278488 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtptz"]
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.327017 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-utilities\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.327205 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khkqg\" (UniqueName: \"kubernetes.io/projected/ecdb421f-d736-4863-8d23-46a8c7697ad5-kube-api-access-khkqg\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.327270 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-catalog-content\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.428875 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-catalog-content\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.429382 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-utilities\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.429536 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khkqg\" (UniqueName: \"kubernetes.io/projected/ecdb421f-d736-4863-8d23-46a8c7697ad5-kube-api-access-khkqg\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.429768 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-catalog-content\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.429875 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-utilities\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.448650 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khkqg\" (UniqueName: \"kubernetes.io/projected/ecdb421f-d736-4863-8d23-46a8c7697ad5-kube-api-access-khkqg\") pod \"redhat-operators-xtptz\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") " pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:06 crc kubenswrapper[4834]: I1008 23:42:06.575130 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:07 crc kubenswrapper[4834]: I1008 23:42:07.008337 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtptz"]
Oct 08 23:42:07 crc kubenswrapper[4834]: I1008 23:42:07.981334 4834 generic.go:334] "Generic (PLEG): container finished" podID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerID="18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a" exitCode=0
Oct 08 23:42:07 crc kubenswrapper[4834]: I1008 23:42:07.981449 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerDied","Data":"18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a"}
Oct 08 23:42:07 crc kubenswrapper[4834]: I1008 23:42:07.982770 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerStarted","Data":"2af10cbe40e9973b4a2120fb3e020af3d2601d2a7f853baa57f0bbe1e1e2586a"}
Oct 08 23:42:08 crc kubenswrapper[4834]: I1008 23:42:08.995361 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerStarted","Data":"40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762"}
Oct 08 23:42:10 crc kubenswrapper[4834]: I1008 23:42:10.005861 4834 generic.go:334] "Generic (PLEG): container finished" podID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerID="40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762" exitCode=0
Oct 08 23:42:10 crc kubenswrapper[4834]: I1008 23:42:10.005909 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerDied","Data":"40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762"}
Oct 08 23:42:11 crc kubenswrapper[4834]: I1008 23:42:11.018510 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerStarted","Data":"204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f"}
Oct 08 23:42:11 crc kubenswrapper[4834]: I1008 23:42:11.043264 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xtptz" podStartSLOduration=2.583730147 podStartE2EDuration="5.041122813s" podCreationTimestamp="2025-10-08 23:42:06 +0000 UTC" firstStartedPulling="2025-10-08 23:42:07.983890127 +0000 UTC m=+4735.806774883" lastFinishedPulling="2025-10-08 23:42:10.441282773 +0000 UTC m=+4738.264167549" observedRunningTime="2025-10-08 23:42:11.036442438 +0000 UTC m=+4738.859327184" watchObservedRunningTime="2025-10-08 23:42:11.041122813 +0000 UTC m=+4738.864007559"
Oct 08 23:42:16 crc kubenswrapper[4834]: I1008 23:42:16.576311 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:16 crc kubenswrapper[4834]: I1008 23:42:16.579262 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:16 crc kubenswrapper[4834]: I1008 23:42:16.903624 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:17 crc kubenswrapper[4834]: I1008 23:42:17.149855 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:17 crc kubenswrapper[4834]: I1008 23:42:17.219084 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtptz"]
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.086291 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xtptz" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="registry-server" containerID="cri-o://204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f" gracePeriod=2
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.607906 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.724832 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khkqg\" (UniqueName: \"kubernetes.io/projected/ecdb421f-d736-4863-8d23-46a8c7697ad5-kube-api-access-khkqg\") pod \"ecdb421f-d736-4863-8d23-46a8c7697ad5\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") "
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.724922 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-catalog-content\") pod \"ecdb421f-d736-4863-8d23-46a8c7697ad5\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") "
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.724976 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-utilities\") pod \"ecdb421f-d736-4863-8d23-46a8c7697ad5\" (UID: \"ecdb421f-d736-4863-8d23-46a8c7697ad5\") "
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.726322 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-utilities" (OuterVolumeSpecName: "utilities") pod "ecdb421f-d736-4863-8d23-46a8c7697ad5" (UID: "ecdb421f-d736-4863-8d23-46a8c7697ad5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.730851 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdb421f-d736-4863-8d23-46a8c7697ad5-kube-api-access-khkqg" (OuterVolumeSpecName: "kube-api-access-khkqg") pod "ecdb421f-d736-4863-8d23-46a8c7697ad5" (UID: "ecdb421f-d736-4863-8d23-46a8c7697ad5"). InnerVolumeSpecName "kube-api-access-khkqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.826622 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khkqg\" (UniqueName: \"kubernetes.io/projected/ecdb421f-d736-4863-8d23-46a8c7697ad5-kube-api-access-khkqg\") on node \"crc\" DevicePath \"\""
Oct 08 23:42:19 crc kubenswrapper[4834]: I1008 23:42:19.826669 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-utilities\") on node \"crc\" DevicePath \"\""
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.096366 4834 generic.go:334] "Generic (PLEG): container finished" podID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerID="204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f" exitCode=0
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.096426 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerDied","Data":"204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f"}
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.096451 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtptz"
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.096469 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtptz" event={"ID":"ecdb421f-d736-4863-8d23-46a8c7697ad5","Type":"ContainerDied","Data":"2af10cbe40e9973b4a2120fb3e020af3d2601d2a7f853baa57f0bbe1e1e2586a"}
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.096496 4834 scope.go:117] "RemoveContainer" containerID="204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f"
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.117885 4834 scope.go:117] "RemoveContainer" containerID="40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762"
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.142722 4834 scope.go:117] "RemoveContainer" containerID="18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a"
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.172716 4834 scope.go:117] "RemoveContainer" containerID="204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f"
Oct 08 23:42:20 crc kubenswrapper[4834]: E1008 23:42:20.173426 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f\": container with ID starting with 204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f not found: ID does not exist" containerID="204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f"
Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.173469 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f"} err="failed to get container status \"204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f\": rpc error: code = NotFound desc = could not find container
\"204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f\": container with ID starting with 204fcb5a38de6d8f86caea4189c00b2eab5f9ed5ef0fe3cbf347f375863bed8f not found: ID does not exist" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.173497 4834 scope.go:117] "RemoveContainer" containerID="40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762" Oct 08 23:42:20 crc kubenswrapper[4834]: E1008 23:42:20.174321 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762\": container with ID starting with 40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762 not found: ID does not exist" containerID="40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.174356 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762"} err="failed to get container status \"40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762\": rpc error: code = NotFound desc = could not find container \"40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762\": container with ID starting with 40c1c757fc0568050b7ec9576a997e86c9b023d0ebe129a40cd27308f24f9762 not found: ID does not exist" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.174377 4834 scope.go:117] "RemoveContainer" containerID="18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a" Oct 08 23:42:20 crc kubenswrapper[4834]: E1008 23:42:20.174721 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a\": container with ID starting with 18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a not found: ID does not exist" 
containerID="18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.174739 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a"} err="failed to get container status \"18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a\": rpc error: code = NotFound desc = could not find container \"18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a\": container with ID starting with 18febc76fc87670343760245cd7e6b9a75758214d35ad4c6fd2bb65747217d5a not found: ID does not exist" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.516626 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ecdb421f-d736-4863-8d23-46a8c7697ad5" (UID: "ecdb421f-d736-4863-8d23-46a8c7697ad5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.537613 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ecdb421f-d736-4863-8d23-46a8c7697ad5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.743054 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtptz"] Oct 08 23:42:20 crc kubenswrapper[4834]: I1008 23:42:20.747588 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xtptz"] Oct 08 23:42:21 crc kubenswrapper[4834]: I1008 23:42:21.573944 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" path="/var/lib/kubelet/pods/ecdb421f-d736-4863-8d23-46a8c7697ad5/volumes" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.679795 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwpf4"] Oct 08 23:42:28 crc kubenswrapper[4834]: E1008 23:42:28.681217 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="extract-content" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.681254 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="extract-content" Oct 08 23:42:28 crc kubenswrapper[4834]: E1008 23:42:28.681324 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="extract-utilities" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.681345 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="extract-utilities" Oct 08 23:42:28 crc kubenswrapper[4834]: E1008 23:42:28.681374 4834 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="registry-server" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.681392 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="registry-server" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.681774 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdb421f-d736-4863-8d23-46a8c7697ad5" containerName="registry-server" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.686039 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.701570 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwpf4"] Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.766678 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-catalog-content\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.766763 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgblp\" (UniqueName: \"kubernetes.io/projected/36f8729b-61ef-4fd3-9819-31e21a9fa616-kube-api-access-fgblp\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.766827 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-utilities\") pod \"certified-operators-gwpf4\" (UID: 
\"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.868912 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-catalog-content\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.869061 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgblp\" (UniqueName: \"kubernetes.io/projected/36f8729b-61ef-4fd3-9819-31e21a9fa616-kube-api-access-fgblp\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.869177 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-utilities\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.870042 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-utilities\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.870742 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-catalog-content\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") 
" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:28 crc kubenswrapper[4834]: I1008 23:42:28.898499 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgblp\" (UniqueName: \"kubernetes.io/projected/36f8729b-61ef-4fd3-9819-31e21a9fa616-kube-api-access-fgblp\") pod \"certified-operators-gwpf4\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:29 crc kubenswrapper[4834]: I1008 23:42:29.050500 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:29 crc kubenswrapper[4834]: I1008 23:42:29.765430 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwpf4"] Oct 08 23:42:29 crc kubenswrapper[4834]: W1008 23:42:29.780236 4834 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f8729b_61ef_4fd3_9819_31e21a9fa616.slice/crio-1d23aa25b77027f39843faa7db34c7edb393f9062c91377d81b0b2b2ca8ea602 WatchSource:0}: Error finding container 1d23aa25b77027f39843faa7db34c7edb393f9062c91377d81b0b2b2ca8ea602: Status 404 returned error can't find the container with id 1d23aa25b77027f39843faa7db34c7edb393f9062c91377d81b0b2b2ca8ea602 Oct 08 23:42:30 crc kubenswrapper[4834]: I1008 23:42:30.213445 4834 generic.go:334] "Generic (PLEG): container finished" podID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerID="d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d" exitCode=0 Oct 08 23:42:30 crc kubenswrapper[4834]: I1008 23:42:30.213536 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerDied","Data":"d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d"} Oct 08 23:42:30 crc kubenswrapper[4834]: I1008 
23:42:30.214238 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerStarted","Data":"1d23aa25b77027f39843faa7db34c7edb393f9062c91377d81b0b2b2ca8ea602"} Oct 08 23:42:31 crc kubenswrapper[4834]: I1008 23:42:31.227322 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerStarted","Data":"c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830"} Oct 08 23:42:32 crc kubenswrapper[4834]: I1008 23:42:32.237918 4834 generic.go:334] "Generic (PLEG): container finished" podID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerID="c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830" exitCode=0 Oct 08 23:42:32 crc kubenswrapper[4834]: I1008 23:42:32.237967 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerDied","Data":"c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830"} Oct 08 23:42:33 crc kubenswrapper[4834]: I1008 23:42:33.262461 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerStarted","Data":"be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90"} Oct 08 23:42:33 crc kubenswrapper[4834]: I1008 23:42:33.283431 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwpf4" podStartSLOduration=2.773834338 podStartE2EDuration="5.283411412s" podCreationTimestamp="2025-10-08 23:42:28 +0000 UTC" firstStartedPulling="2025-10-08 23:42:30.216559999 +0000 UTC m=+4758.039444785" lastFinishedPulling="2025-10-08 23:42:32.726137113 +0000 UTC m=+4760.549021859" 
observedRunningTime="2025-10-08 23:42:33.278348137 +0000 UTC m=+4761.101232883" watchObservedRunningTime="2025-10-08 23:42:33.283411412 +0000 UTC m=+4761.106296158" Oct 08 23:42:39 crc kubenswrapper[4834]: I1008 23:42:39.052077 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:39 crc kubenswrapper[4834]: I1008 23:42:39.052736 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:39 crc kubenswrapper[4834]: I1008 23:42:39.130095 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:39 crc kubenswrapper[4834]: I1008 23:42:39.386825 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:39 crc kubenswrapper[4834]: I1008 23:42:39.458250 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwpf4"] Oct 08 23:42:41 crc kubenswrapper[4834]: I1008 23:42:41.340311 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwpf4" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="registry-server" containerID="cri-o://be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90" gracePeriod=2 Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.345552 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.351069 4834 generic.go:334] "Generic (PLEG): container finished" podID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerID="be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90" exitCode=0 Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.351133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerDied","Data":"be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90"} Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.351205 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwpf4" event={"ID":"36f8729b-61ef-4fd3-9819-31e21a9fa616","Type":"ContainerDied","Data":"1d23aa25b77027f39843faa7db34c7edb393f9062c91377d81b0b2b2ca8ea602"} Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.351229 4834 scope.go:117] "RemoveContainer" containerID="be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.351286 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwpf4" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.412829 4834 scope.go:117] "RemoveContainer" containerID="c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.416887 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgblp\" (UniqueName: \"kubernetes.io/projected/36f8729b-61ef-4fd3-9819-31e21a9fa616-kube-api-access-fgblp\") pod \"36f8729b-61ef-4fd3-9819-31e21a9fa616\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.416955 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-catalog-content\") pod \"36f8729b-61ef-4fd3-9819-31e21a9fa616\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.417220 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-utilities\") pod \"36f8729b-61ef-4fd3-9819-31e21a9fa616\" (UID: \"36f8729b-61ef-4fd3-9819-31e21a9fa616\") " Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.418491 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-utilities" (OuterVolumeSpecName: "utilities") pod "36f8729b-61ef-4fd3-9819-31e21a9fa616" (UID: "36f8729b-61ef-4fd3-9819-31e21a9fa616"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.418670 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.426465 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36f8729b-61ef-4fd3-9819-31e21a9fa616-kube-api-access-fgblp" (OuterVolumeSpecName: "kube-api-access-fgblp") pod "36f8729b-61ef-4fd3-9819-31e21a9fa616" (UID: "36f8729b-61ef-4fd3-9819-31e21a9fa616"). InnerVolumeSpecName "kube-api-access-fgblp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.451868 4834 scope.go:117] "RemoveContainer" containerID="d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.471178 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36f8729b-61ef-4fd3-9819-31e21a9fa616" (UID: "36f8729b-61ef-4fd3-9819-31e21a9fa616"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.499789 4834 scope.go:117] "RemoveContainer" containerID="be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90" Oct 08 23:42:42 crc kubenswrapper[4834]: E1008 23:42:42.500748 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90\": container with ID starting with be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90 not found: ID does not exist" containerID="be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.500820 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90"} err="failed to get container status \"be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90\": rpc error: code = NotFound desc = could not find container \"be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90\": container with ID starting with be32ba6fbca22cc6913b9ffe68a9baac6e1669b999098169e64f6d2921d93c90 not found: ID does not exist" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.500865 4834 scope.go:117] "RemoveContainer" containerID="c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830" Oct 08 23:42:42 crc kubenswrapper[4834]: E1008 23:42:42.501521 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830\": container with ID starting with c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830 not found: ID does not exist" containerID="c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.501553 
4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830"} err="failed to get container status \"c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830\": rpc error: code = NotFound desc = could not find container \"c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830\": container with ID starting with c0c719efd020efcb334ef98f98f18a3b51ced68f2aedac319a4704c587dc2830 not found: ID does not exist" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.501575 4834 scope.go:117] "RemoveContainer" containerID="d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d" Oct 08 23:42:42 crc kubenswrapper[4834]: E1008 23:42:42.502028 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d\": container with ID starting with d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d not found: ID does not exist" containerID="d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.502084 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d"} err="failed to get container status \"d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d\": rpc error: code = NotFound desc = could not find container \"d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d\": container with ID starting with d22f236e94e3fe34fb3686827c4f06e56edf55b9b7048931475871cf26b6fc0d not found: ID does not exist" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.520640 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgblp\" (UniqueName: 
\"kubernetes.io/projected/36f8729b-61ef-4fd3-9819-31e21a9fa616-kube-api-access-fgblp\") on node \"crc\" DevicePath \"\"" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.520688 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36f8729b-61ef-4fd3-9819-31e21a9fa616-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.705557 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwpf4"] Oct 08 23:42:42 crc kubenswrapper[4834]: I1008 23:42:42.711181 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwpf4"] Oct 08 23:42:43 crc kubenswrapper[4834]: I1008 23:42:43.572613 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" path="/var/lib/kubelet/pods/36f8729b-61ef-4fd3-9819-31e21a9fa616/volumes" Oct 08 23:42:47 crc kubenswrapper[4834]: I1008 23:42:47.026218 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:42:47 crc kubenswrapper[4834]: I1008 23:42:47.026996 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.175816 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xsplm"] Oct 08 23:42:48 crc kubenswrapper[4834]: E1008 23:42:48.176310 4834 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="registry-server" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.176331 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="registry-server" Oct 08 23:42:48 crc kubenswrapper[4834]: E1008 23:42:48.176357 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="extract-utilities" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.176370 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="extract-utilities" Oct 08 23:42:48 crc kubenswrapper[4834]: E1008 23:42:48.176399 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="extract-content" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.176411 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="extract-content" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.176736 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="36f8729b-61ef-4fd3-9819-31e21a9fa616" containerName="registry-server" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.178852 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.196239 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsplm"] Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.218084 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlwk\" (UniqueName: \"kubernetes.io/projected/a27934f1-fa93-4776-a8ce-75bf90e4638e-kube-api-access-qxlwk\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.218190 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-catalog-content\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.218298 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-utilities\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.319246 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxlwk\" (UniqueName: \"kubernetes.io/projected/a27934f1-fa93-4776-a8ce-75bf90e4638e-kube-api-access-qxlwk\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.319391 4834 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-catalog-content\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.319507 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-utilities\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.320086 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-catalog-content\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.320306 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-utilities\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.339709 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlwk\" (UniqueName: \"kubernetes.io/projected/a27934f1-fa93-4776-a8ce-75bf90e4638e-kube-api-access-qxlwk\") pod \"community-operators-xsplm\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.509878 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:48 crc kubenswrapper[4834]: I1008 23:42:48.818127 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xsplm"] Oct 08 23:42:49 crc kubenswrapper[4834]: I1008 23:42:49.418601 4834 generic.go:334] "Generic (PLEG): container finished" podID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerID="55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b" exitCode=0 Oct 08 23:42:49 crc kubenswrapper[4834]: I1008 23:42:49.418682 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerDied","Data":"55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b"} Oct 08 23:42:49 crc kubenswrapper[4834]: I1008 23:42:49.418766 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerStarted","Data":"b8471f0d9bc2ac213d5105b156c20aeafb994f5767fdf1bf3cb3a52bf9a7e0b1"} Oct 08 23:42:50 crc kubenswrapper[4834]: I1008 23:42:50.444795 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerStarted","Data":"74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475"} Oct 08 23:42:51 crc kubenswrapper[4834]: I1008 23:42:51.456819 4834 generic.go:334] "Generic (PLEG): container finished" podID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerID="74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475" exitCode=0 Oct 08 23:42:51 crc kubenswrapper[4834]: I1008 23:42:51.456915 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" 
event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerDied","Data":"74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475"} Oct 08 23:42:52 crc kubenswrapper[4834]: I1008 23:42:52.472889 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerStarted","Data":"18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107"} Oct 08 23:42:52 crc kubenswrapper[4834]: I1008 23:42:52.504900 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xsplm" podStartSLOduration=2.047337758 podStartE2EDuration="4.504886518s" podCreationTimestamp="2025-10-08 23:42:48 +0000 UTC" firstStartedPulling="2025-10-08 23:42:49.42128206 +0000 UTC m=+4777.244166846" lastFinishedPulling="2025-10-08 23:42:51.87883081 +0000 UTC m=+4779.701715606" observedRunningTime="2025-10-08 23:42:52.501163706 +0000 UTC m=+4780.324048452" watchObservedRunningTime="2025-10-08 23:42:52.504886518 +0000 UTC m=+4780.327771264" Oct 08 23:42:58 crc kubenswrapper[4834]: I1008 23:42:58.510083 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:58 crc kubenswrapper[4834]: I1008 23:42:58.510809 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:58 crc kubenswrapper[4834]: I1008 23:42:58.588743 4834 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:58 crc kubenswrapper[4834]: I1008 23:42:58.664995 4834 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:42:59 crc kubenswrapper[4834]: I1008 23:42:59.838036 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-xsplm"] Oct 08 23:43:00 crc kubenswrapper[4834]: I1008 23:43:00.563113 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xsplm" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="registry-server" containerID="cri-o://18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107" gracePeriod=2 Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.058652 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.228122 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxlwk\" (UniqueName: \"kubernetes.io/projected/a27934f1-fa93-4776-a8ce-75bf90e4638e-kube-api-access-qxlwk\") pod \"a27934f1-fa93-4776-a8ce-75bf90e4638e\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.228317 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-utilities\") pod \"a27934f1-fa93-4776-a8ce-75bf90e4638e\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.228392 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-catalog-content\") pod \"a27934f1-fa93-4776-a8ce-75bf90e4638e\" (UID: \"a27934f1-fa93-4776-a8ce-75bf90e4638e\") " Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.229636 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-utilities" (OuterVolumeSpecName: "utilities") pod "a27934f1-fa93-4776-a8ce-75bf90e4638e" (UID: 
"a27934f1-fa93-4776-a8ce-75bf90e4638e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.235445 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27934f1-fa93-4776-a8ce-75bf90e4638e-kube-api-access-qxlwk" (OuterVolumeSpecName: "kube-api-access-qxlwk") pod "a27934f1-fa93-4776-a8ce-75bf90e4638e" (UID: "a27934f1-fa93-4776-a8ce-75bf90e4638e"). InnerVolumeSpecName "kube-api-access-qxlwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.301902 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a27934f1-fa93-4776-a8ce-75bf90e4638e" (UID: "a27934f1-fa93-4776-a8ce-75bf90e4638e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.331067 4834 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-utilities\") on node \"crc\" DevicePath \"\"" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.331128 4834 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a27934f1-fa93-4776-a8ce-75bf90e4638e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.331222 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxlwk\" (UniqueName: \"kubernetes.io/projected/a27934f1-fa93-4776-a8ce-75bf90e4638e-kube-api-access-qxlwk\") on node \"crc\" DevicePath \"\"" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.577768 4834 generic.go:334] "Generic (PLEG): container finished" 
podID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerID="18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107" exitCode=0 Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.577841 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerDied","Data":"18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107"} Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.577884 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xsplm" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.577925 4834 scope.go:117] "RemoveContainer" containerID="18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.577904 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xsplm" event={"ID":"a27934f1-fa93-4776-a8ce-75bf90e4638e","Type":"ContainerDied","Data":"b8471f0d9bc2ac213d5105b156c20aeafb994f5767fdf1bf3cb3a52bf9a7e0b1"} Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.630713 4834 scope.go:117] "RemoveContainer" containerID="74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.640260 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xsplm"] Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.646712 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xsplm"] Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.667582 4834 scope.go:117] "RemoveContainer" containerID="55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.696768 4834 scope.go:117] "RemoveContainer" 
containerID="18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107" Oct 08 23:43:01 crc kubenswrapper[4834]: E1008 23:43:01.698504 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107\": container with ID starting with 18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107 not found: ID does not exist" containerID="18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.698600 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107"} err="failed to get container status \"18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107\": rpc error: code = NotFound desc = could not find container \"18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107\": container with ID starting with 18c52c805a6bcd20859e62481589067cbf1c2eb5f52194a51bb454448e7a3107 not found: ID does not exist" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.698664 4834 scope.go:117] "RemoveContainer" containerID="74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475" Oct 08 23:43:01 crc kubenswrapper[4834]: E1008 23:43:01.699287 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475\": container with ID starting with 74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475 not found: ID does not exist" containerID="74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.699403 4834 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475"} err="failed to get container status \"74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475\": rpc error: code = NotFound desc = could not find container \"74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475\": container with ID starting with 74b02a6540b91555e2ad3b387c91072cffa2a2d52cef5d76826adb89dc520475 not found: ID does not exist" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.699482 4834 scope.go:117] "RemoveContainer" containerID="55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b" Oct 08 23:43:01 crc kubenswrapper[4834]: E1008 23:43:01.699941 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b\": container with ID starting with 55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b not found: ID does not exist" containerID="55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b" Oct 08 23:43:01 crc kubenswrapper[4834]: I1008 23:43:01.700031 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b"} err="failed to get container status \"55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b\": rpc error: code = NotFound desc = could not find container \"55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b\": container with ID starting with 55078b484aa7e6abeec9889dc51fd3734a3691d5262f2aad577c3810b5e31f9b not found: ID does not exist" Oct 08 23:43:03 crc kubenswrapper[4834]: I1008 23:43:03.571136 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" path="/var/lib/kubelet/pods/a27934f1-fa93-4776-a8ce-75bf90e4638e/volumes" Oct 08 23:43:17 crc kubenswrapper[4834]: I1008 
23:43:17.025813 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:43:17 crc kubenswrapper[4834]: I1008 23:43:17.026769 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:43:47 crc kubenswrapper[4834]: I1008 23:43:47.025633 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:43:47 crc kubenswrapper[4834]: I1008 23:43:47.026883 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:43:47 crc kubenswrapper[4834]: I1008 23:43:47.026999 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:43:47 crc kubenswrapper[4834]: I1008 23:43:47.028768 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"276ba5956ea31e01e4d465b12ffad880ddc21751d64b35842ae87952c0e6e1f3"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:43:47 crc kubenswrapper[4834]: I1008 23:43:47.028890 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://276ba5956ea31e01e4d465b12ffad880ddc21751d64b35842ae87952c0e6e1f3" gracePeriod=600 Oct 08 23:43:48 crc kubenswrapper[4834]: I1008 23:43:48.030047 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="276ba5956ea31e01e4d465b12ffad880ddc21751d64b35842ae87952c0e6e1f3" exitCode=0 Oct 08 23:43:48 crc kubenswrapper[4834]: I1008 23:43:48.030133 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"276ba5956ea31e01e4d465b12ffad880ddc21751d64b35842ae87952c0e6e1f3"} Oct 08 23:43:48 crc kubenswrapper[4834]: I1008 23:43:48.030698 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerStarted","Data":"36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd"} Oct 08 23:43:48 crc kubenswrapper[4834]: I1008 23:43:48.030736 4834 scope.go:117] "RemoveContainer" containerID="c3c1beda42090046820939e14de17180d59262097d9cb47caf3543d34f215f35" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.168992 4834 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qbpqz/must-gather-mlvts"] Oct 08 23:44:52 crc kubenswrapper[4834]: E1008 23:44:52.169708 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="registry-server" Oct 08 23:44:52 crc 
kubenswrapper[4834]: I1008 23:44:52.169720 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="registry-server" Oct 08 23:44:52 crc kubenswrapper[4834]: E1008 23:44:52.169741 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="extract-content" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.169747 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="extract-content" Oct 08 23:44:52 crc kubenswrapper[4834]: E1008 23:44:52.169780 4834 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="extract-utilities" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.169787 4834 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="extract-utilities" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.169917 4834 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27934f1-fa93-4776-a8ce-75bf90e4638e" containerName="registry-server" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.170598 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.173527 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qbpqz"/"openshift-service-ca.crt" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.173930 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qbpqz"/"default-dockercfg-hxbh8" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.175216 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qbpqz"/"kube-root-ca.crt" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.176030 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qbpqz/must-gather-mlvts"] Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.288232 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhnv\" (UniqueName: \"kubernetes.io/projected/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-kube-api-access-bjhnv\") pod \"must-gather-mlvts\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.288349 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-must-gather-output\") pod \"must-gather-mlvts\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.389860 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhnv\" (UniqueName: \"kubernetes.io/projected/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-kube-api-access-bjhnv\") pod \"must-gather-mlvts\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " 
pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.389931 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-must-gather-output\") pod \"must-gather-mlvts\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.390322 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-must-gather-output\") pod \"must-gather-mlvts\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.412592 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhnv\" (UniqueName: \"kubernetes.io/projected/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-kube-api-access-bjhnv\") pod \"must-gather-mlvts\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.485964 4834 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:44:52 crc kubenswrapper[4834]: I1008 23:44:52.962134 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qbpqz/must-gather-mlvts"] Oct 08 23:44:53 crc kubenswrapper[4834]: I1008 23:44:53.731861 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbpqz/must-gather-mlvts" event={"ID":"c91f0734-c31f-4220-b58b-c3b8d9cf18e9","Type":"ContainerStarted","Data":"bbcf6afa4579c8d4d92596da786abc90f40f5f2f8881cfa4738ab594a2368ed8"} Oct 08 23:44:57 crc kubenswrapper[4834]: I1008 23:44:57.764452 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbpqz/must-gather-mlvts" event={"ID":"c91f0734-c31f-4220-b58b-c3b8d9cf18e9","Type":"ContainerStarted","Data":"61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424"} Oct 08 23:44:58 crc kubenswrapper[4834]: I1008 23:44:58.771450 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbpqz/must-gather-mlvts" event={"ID":"c91f0734-c31f-4220-b58b-c3b8d9cf18e9","Type":"ContainerStarted","Data":"b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0"} Oct 08 23:44:58 crc kubenswrapper[4834]: I1008 23:44:58.789262 4834 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qbpqz/must-gather-mlvts" podStartSLOduration=2.897930654 podStartE2EDuration="6.789247948s" podCreationTimestamp="2025-10-08 23:44:52 +0000 UTC" firstStartedPulling="2025-10-08 23:44:52.972758318 +0000 UTC m=+4900.795643064" lastFinishedPulling="2025-10-08 23:44:56.864075572 +0000 UTC m=+4904.686960358" observedRunningTime="2025-10-08 23:44:58.789237468 +0000 UTC m=+4906.612122214" watchObservedRunningTime="2025-10-08 23:44:58.789247948 +0000 UTC m=+4906.612132694" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.144009 4834 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn"] Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.145200 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.147139 4834 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.147712 4834 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.158269 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn"] Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.307364 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a08a62ab-babd-4145-9211-c7819c4b3b22-secret-volume\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.307420 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a08a62ab-babd-4145-9211-c7819c4b3b22-config-volume\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.307686 4834 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68kw\" (UniqueName: 
\"kubernetes.io/projected/a08a62ab-babd-4145-9211-c7819c4b3b22-kube-api-access-z68kw\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.408847 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a08a62ab-babd-4145-9211-c7819c4b3b22-secret-volume\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.408927 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a08a62ab-babd-4145-9211-c7819c4b3b22-config-volume\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.408989 4834 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z68kw\" (UniqueName: \"kubernetes.io/projected/a08a62ab-babd-4145-9211-c7819c4b3b22-kube-api-access-z68kw\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.410026 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a08a62ab-babd-4145-9211-c7819c4b3b22-config-volume\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 
23:45:00.415450 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a08a62ab-babd-4145-9211-c7819c4b3b22-secret-volume\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.435737 4834 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68kw\" (UniqueName: \"kubernetes.io/projected/a08a62ab-babd-4145-9211-c7819c4b3b22-kube-api-access-z68kw\") pod \"collect-profiles-29332785-tl6pn\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.515503 4834 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:00 crc kubenswrapper[4834]: I1008 23:45:00.994383 4834 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn"] Oct 08 23:45:01 crc kubenswrapper[4834]: I1008 23:45:01.791793 4834 generic.go:334] "Generic (PLEG): container finished" podID="a08a62ab-babd-4145-9211-c7819c4b3b22" containerID="3b75cc4c071a7f89263795d145bffa549ce8f8a2893fa19e9aa3c9005c9f4950" exitCode=0 Oct 08 23:45:01 crc kubenswrapper[4834]: I1008 23:45:01.791885 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" event={"ID":"a08a62ab-babd-4145-9211-c7819c4b3b22","Type":"ContainerDied","Data":"3b75cc4c071a7f89263795d145bffa549ce8f8a2893fa19e9aa3c9005c9f4950"} Oct 08 23:45:01 crc kubenswrapper[4834]: I1008 23:45:01.792064 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" 
event={"ID":"a08a62ab-babd-4145-9211-c7819c4b3b22","Type":"ContainerStarted","Data":"8b528836ee6a06ffc710635919347ede8c6576e86cae8dbb624b5d4376c7536b"} Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.026923 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.148487 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z68kw\" (UniqueName: \"kubernetes.io/projected/a08a62ab-babd-4145-9211-c7819c4b3b22-kube-api-access-z68kw\") pod \"a08a62ab-babd-4145-9211-c7819c4b3b22\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.148574 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a08a62ab-babd-4145-9211-c7819c4b3b22-config-volume\") pod \"a08a62ab-babd-4145-9211-c7819c4b3b22\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.148601 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a08a62ab-babd-4145-9211-c7819c4b3b22-secret-volume\") pod \"a08a62ab-babd-4145-9211-c7819c4b3b22\" (UID: \"a08a62ab-babd-4145-9211-c7819c4b3b22\") " Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.149260 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08a62ab-babd-4145-9211-c7819c4b3b22-config-volume" (OuterVolumeSpecName: "config-volume") pod "a08a62ab-babd-4145-9211-c7819c4b3b22" (UID: "a08a62ab-babd-4145-9211-c7819c4b3b22"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.153593 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08a62ab-babd-4145-9211-c7819c4b3b22-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a08a62ab-babd-4145-9211-c7819c4b3b22" (UID: "a08a62ab-babd-4145-9211-c7819c4b3b22"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.156032 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08a62ab-babd-4145-9211-c7819c4b3b22-kube-api-access-z68kw" (OuterVolumeSpecName: "kube-api-access-z68kw") pod "a08a62ab-babd-4145-9211-c7819c4b3b22" (UID: "a08a62ab-babd-4145-9211-c7819c4b3b22"). InnerVolumeSpecName "kube-api-access-z68kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.249747 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z68kw\" (UniqueName: \"kubernetes.io/projected/a08a62ab-babd-4145-9211-c7819c4b3b22-kube-api-access-z68kw\") on node \"crc\" DevicePath \"\"" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.249788 4834 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a08a62ab-babd-4145-9211-c7819c4b3b22-config-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.249802 4834 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a08a62ab-babd-4145-9211-c7819c4b3b22-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.807197 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" 
event={"ID":"a08a62ab-babd-4145-9211-c7819c4b3b22","Type":"ContainerDied","Data":"8b528836ee6a06ffc710635919347ede8c6576e86cae8dbb624b5d4376c7536b"} Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.807245 4834 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b528836ee6a06ffc710635919347ede8c6576e86cae8dbb624b5d4376c7536b" Oct 08 23:45:03 crc kubenswrapper[4834]: I1008 23:45:03.807302 4834 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29332785-tl6pn" Oct 08 23:45:04 crc kubenswrapper[4834]: I1008 23:45:04.089283 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"] Oct 08 23:45:04 crc kubenswrapper[4834]: I1008 23:45:04.093744 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29332740-7f6kp"] Oct 08 23:45:05 crc kubenswrapper[4834]: I1008 23:45:05.572606 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca2c493-23f8-42c9-be1e-968ebe02de13" path="/var/lib/kubelet/pods/6ca2c493-23f8-42c9-be1e-968ebe02de13/volumes" Oct 08 23:45:23 crc kubenswrapper[4834]: I1008 23:45:23.505859 4834 scope.go:117] "RemoveContainer" containerID="44439a7551b735867dceb42bf54e6fe469833e1462650e3425630a21dbd2b055" Oct 08 23:45:47 crc kubenswrapper[4834]: I1008 23:45:47.025983 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:45:47 crc kubenswrapper[4834]: I1008 23:45:47.026506 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.324424 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/util/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.485675 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/pull/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.499052 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/pull/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.518624 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/util/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.710267 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/pull/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.722260 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/util/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.750738 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_184bca519f21fd0fd55ec08aa4c93472bde537c6b6fd14be86cecc7c99792fh_1d0326a6-118b-492f-98fd-139f5b8fdcf0/extract/0.log" Oct 08 23:45:55 
crc kubenswrapper[4834]: I1008 23:45:55.898592 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-dvbzh_864591d4-af96-44e6-8a1f-a01bf0b9fb44/kube-rbac-proxy/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.965083 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-dvbzh_864591d4-af96-44e6-8a1f-a01bf0b9fb44/manager/0.log" Oct 08 23:45:55 crc kubenswrapper[4834]: I1008 23:45:55.990650 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-8fbc8_6dbd0034-b992-4a64-ab92-268abe380d03/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.095587 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-8fbc8_6dbd0034-b992-4a64-ab92-268abe380d03/manager/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.157006 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-8fhdd_775cc910-4ea9-4da1-b35a-a31b4c880010/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.185968 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-8fhdd_775cc910-4ea9-4da1-b35a-a31b4c880010/manager/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.277098 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-nf4vz_4f3521e9-05df-408a-a765-7a7ba0046afa/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.449045 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-nf4vz_4f3521e9-05df-408a-a765-7a7ba0046afa/manager/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.469027 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-xprwf_b886f9b3-c296-4445-9a60-cb6809463741/manager/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.479696 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-xprwf_b886f9b3-c296-4445-9a60-cb6809463741/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.602193 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-lwsbk_5d7e03d9-baa9-4867-9fbc-91a82a36f4e2/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.631576 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-lwsbk_5d7e03d9-baa9-4867-9fbc-91a82a36f4e2/manager/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.767389 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-mxst7_baa45e46-72a6-4f2f-af9e-ce679038b8f1/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.877884 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-ssskr_84427f76-6342-4e6b-9875-56b2d3db0fac/kube-rbac-proxy/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.893310 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-mxst7_baa45e46-72a6-4f2f-af9e-ce679038b8f1/manager/0.log" Oct 08 23:45:56 crc kubenswrapper[4834]: I1008 23:45:56.946786 4834 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-ssskr_84427f76-6342-4e6b-9875-56b2d3db0fac/manager/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.033585 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-z5gzx_306dc2e6-3f9b-45c6-b615-75a6d2098857/kube-rbac-proxy/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.149993 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-z5gzx_306dc2e6-3f9b-45c6-b615-75a6d2098857/manager/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.238026 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-q2qld_5452435d-906d-4f08-87e9-168187eb4d5c/manager/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.267866 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-q2qld_5452435d-906d-4f08-87e9-168187eb4d5c/kube-rbac-proxy/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.421800 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-7mh2r_cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82/manager/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.451832 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-7mh2r_cf7ce4de-5f9c-47c3-a01b-2e23e1e27a82/kube-rbac-proxy/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.560511 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-jkmf4_0cdb778f-1017-4f86-9443-85d7bf158bd6/kube-rbac-proxy/0.log" Oct 08 23:45:57 crc 
kubenswrapper[4834]: I1008 23:45:57.659551 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-rw77z_6a282f5d-5a52-4a80-bf82-6eea47007564/kube-rbac-proxy/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.704298 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-jkmf4_0cdb778f-1017-4f86-9443-85d7bf158bd6/manager/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.873631 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-rw77z_6a282f5d-5a52-4a80-bf82-6eea47007564/manager/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.882410 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-zqxbk_f8aea845-c4c0-469f-ac39-9f4525e69ec5/kube-rbac-proxy/0.log" Oct 08 23:45:57 crc kubenswrapper[4834]: I1008 23:45:57.958976 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-zqxbk_f8aea845-c4c0-469f-ac39-9f4525e69ec5/manager/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.072282 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm_96bb0f12-b144-4f24-9b48-407519d51c6e/manager/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.093737 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-677c5f5bffgbrnm_96bb0f12-b144-4f24-9b48-407519d51c6e/kube-rbac-proxy/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.193719 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d6957655c-ggfj6_00f8a5ab-e561-4b67-a56e-791342c7dbb4/kube-rbac-proxy/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.287335 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-848c57cb5c-dclqm_3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a/kube-rbac-proxy/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.542054 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fgcvd_973cf5c7-1209-49ad-bb4e-02b88d9d2df4/registry-server/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.562686 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-848c57cb5c-dclqm_3c4ac3ad-ab59-48d1-bfcd-ced9b96a5f8a/operator/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.724019 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79db49b9fb-q9bmt_d062b1e7-96b9-48e0-acb3-528dc3c7e59c/kube-rbac-proxy/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.766429 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79db49b9fb-q9bmt_d062b1e7-96b9-48e0-acb3-528dc3c7e59c/manager/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.780076 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-6r5r9_3ffe8d9e-bb57-4156-8eff-fdb070a67e6d/kube-rbac-proxy/0.log" Oct 08 23:45:58 crc kubenswrapper[4834]: I1008 23:45:58.917914 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-6r5r9_3ffe8d9e-bb57-4156-8eff-fdb070a67e6d/manager/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.000890 4834 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7d6957655c-ggfj6_00f8a5ab-e561-4b67-a56e-791342c7dbb4/manager/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.025172 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-5c2bw_0f36bacd-60f1-41f9-a0e9-1429cecade32/operator/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.093462 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-vxg74_95926917-39a4-4757-b246-874a680f97ce/kube-rbac-proxy/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.132976 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-vxg74_95926917-39a4-4757-b246-874a680f97ce/manager/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.195578 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-7lzg8_6c2fa94f-b7fc-496b-a2f4-81695f1d86b2/kube-rbac-proxy/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.240838 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-7lzg8_6c2fa94f-b7fc-496b-a2f4-81695f1d86b2/manager/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.313011 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-phcvn_8e0f70ea-9d3e-4ada-834a-1134f8485204/kube-rbac-proxy/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.394931 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-phcvn_8e0f70ea-9d3e-4ada-834a-1134f8485204/manager/0.log" Oct 08 23:45:59 crc 
kubenswrapper[4834]: I1008 23:45:59.418479 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-dkc8b_3286b799-dca1-495d-b329-9c46d97e024b/kube-rbac-proxy/0.log" Oct 08 23:45:59 crc kubenswrapper[4834]: I1008 23:45:59.480482 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-dkc8b_3286b799-dca1-495d-b329-9c46d97e024b/manager/0.log" Oct 08 23:46:17 crc kubenswrapper[4834]: I1008 23:46:17.025651 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:46:17 crc kubenswrapper[4834]: I1008 23:46:17.026411 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:46:17 crc kubenswrapper[4834]: I1008 23:46:17.483601 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-m5dj6_f5fbb5d9-094a-4948-a1f2-2f2b84bae26d/control-plane-machine-set-operator/0.log" Oct 08 23:46:17 crc kubenswrapper[4834]: I1008 23:46:17.614801 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-66fwq_b42878da-c62e-425b-b147-57836dcd9a2d/kube-rbac-proxy/0.log" Oct 08 23:46:17 crc kubenswrapper[4834]: I1008 23:46:17.714510 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-66fwq_b42878da-c62e-425b-b147-57836dcd9a2d/machine-api-operator/0.log" Oct 08 23:46:31 crc kubenswrapper[4834]: I1008 23:46:31.188366 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-g5vrh_f01ea414-13bf-4228-a97f-32f5810dfd5b/cert-manager-controller/0.log" Oct 08 23:46:31 crc kubenswrapper[4834]: I1008 23:46:31.344391 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-lr8nd_019cf26b-39c2-4ee8-b93f-6cdb0c3310cd/cert-manager-cainjector/0.log" Oct 08 23:46:31 crc kubenswrapper[4834]: I1008 23:46:31.404997 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-5qfsl_3ebaa304-2d20-49a0-8c2d-46c1f53e94bb/cert-manager-webhook/0.log" Oct 08 23:46:45 crc kubenswrapper[4834]: I1008 23:46:45.724664 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-g2lcj_9b787c76-801d-4aa7-ad57-24eb4e4a1232/nmstate-console-plugin/0.log" Oct 08 23:46:45 crc kubenswrapper[4834]: I1008 23:46:45.819692 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7nbqk_199673d0-23b5-446b-a8c0-f7e49043acc8/nmstate-handler/0.log" Oct 08 23:46:45 crc kubenswrapper[4834]: I1008 23:46:45.908981 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-phk7h_ab078381-fb7e-4f3b-959e-bca22362c6bc/kube-rbac-proxy/0.log" Oct 08 23:46:45 crc kubenswrapper[4834]: I1008 23:46:45.994844 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-phk7h_ab078381-fb7e-4f3b-959e-bca22362c6bc/nmstate-metrics/0.log" Oct 08 23:46:46 crc kubenswrapper[4834]: I1008 23:46:46.099414 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-9hlt4_8d63afa4-5c31-460c-9409-4752cbe62b7b/nmstate-operator/0.log" Oct 08 23:46:46 crc kubenswrapper[4834]: I1008 23:46:46.171788 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-dmfct_421a6435-345f-4452-a180-93038948456a/nmstate-webhook/0.log" Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.026085 4834 patch_prober.go:28] interesting pod/machine-config-daemon-f9m4z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.026185 4834 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.026242 4834 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.026994 4834 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd"} pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.027057 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" 
podUID="732cf917-b3ec-4649-99b0-66653902cfc2" containerName="machine-config-daemon" containerID="cri-o://36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" gracePeriod=600 Oct 08 23:46:47 crc kubenswrapper[4834]: E1008 23:46:47.146604 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.692116 4834 generic.go:334] "Generic (PLEG): container finished" podID="732cf917-b3ec-4649-99b0-66653902cfc2" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" exitCode=0 Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.692329 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" event={"ID":"732cf917-b3ec-4649-99b0-66653902cfc2","Type":"ContainerDied","Data":"36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd"} Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.692580 4834 scope.go:117] "RemoveContainer" containerID="276ba5956ea31e01e4d465b12ffad880ddc21751d64b35842ae87952c0e6e1f3" Oct 08 23:46:47 crc kubenswrapper[4834]: I1008 23:46:47.693413 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:46:47 crc kubenswrapper[4834]: E1008 23:46:47.693834 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:47:00 crc kubenswrapper[4834]: I1008 23:47:00.742713 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-g8nxg_c01f98dd-783e-431d-a695-053b316e9c60/kube-rbac-proxy/0.log" Oct 08 23:47:00 crc kubenswrapper[4834]: I1008 23:47:00.979721 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-frr-files/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.143243 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-reloader/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.162170 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-g8nxg_c01f98dd-783e-431d-a695-053b316e9c60/controller/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.193932 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-frr-files/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.200767 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-metrics/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.311181 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-reloader/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.454640 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-reloader/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 
23:47:01.476546 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-frr-files/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.478049 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-metrics/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.512211 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-metrics/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.679962 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-metrics/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.689252 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-frr-files/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.689563 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/cp-reloader/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.710321 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/controller/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.858690 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/frr-metrics/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.887791 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/kube-rbac-proxy/0.log" Oct 08 23:47:01 crc kubenswrapper[4834]: I1008 23:47:01.941942 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/kube-rbac-proxy-frr/0.log" Oct 08 23:47:02 crc kubenswrapper[4834]: I1008 23:47:02.026614 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/reloader/0.log" Oct 08 23:47:02 crc kubenswrapper[4834]: I1008 23:47:02.120853 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-czqvx_ca858c41-1390-4fcb-84a9-07b0482b6996/frr-k8s-webhook-server/0.log" Oct 08 23:47:02 crc kubenswrapper[4834]: I1008 23:47:02.265108 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-64c74dd74f-st2xs_3ad77aef-6ec4-4902-b24e-64599745e983/manager/0.log" Oct 08 23:47:02 crc kubenswrapper[4834]: I1008 23:47:02.428284 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7ffb9d7cb9-tz7qq_e5afee87-da07-475a-b94a-8a473a64be9b/webhook-server/0.log" Oct 08 23:47:02 crc kubenswrapper[4834]: I1008 23:47:02.559544 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-n8v9d_53b63e43-0ee5-4d6c-b029-5d24b2d5aa96/kube-rbac-proxy/0.log" Oct 08 23:47:03 crc kubenswrapper[4834]: I1008 23:47:03.115333 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-n8v9d_53b63e43-0ee5-4d6c-b029-5d24b2d5aa96/speaker/0.log" Oct 08 23:47:03 crc kubenswrapper[4834]: I1008 23:47:03.258957 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7mpst_e5e9deb8-fb15-4cfb-8104-52006098ee11/frr/0.log" Oct 08 23:47:03 crc kubenswrapper[4834]: I1008 23:47:03.565082 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:47:03 crc kubenswrapper[4834]: E1008 23:47:03.565560 4834 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:47:14 crc kubenswrapper[4834]: I1008 23:47:14.556051 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:47:14 crc kubenswrapper[4834]: E1008 23:47:14.557197 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.513651 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/util/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.641560 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/util/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.663180 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/pull/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.729090 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/pull/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.803521 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/util/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.843443 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/pull/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.851768 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69l2z9r_ef56eccf-0b07-4f8d-b7fd-6f9c025fc856/extract/0.log" Oct 08 23:47:16 crc kubenswrapper[4834]: I1008 23:47:16.969603 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/util/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.111740 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/util/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.120764 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/pull/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.164415 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/pull/0.log" Oct 08 
23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.316063 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/pull/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.348982 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/util/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.373861 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2bg29q_f7d60134-60c3-498e-9550-fafb7900fcf1/extract/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.497205 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/extract-utilities/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.679838 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/extract-utilities/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.695565 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/extract-content/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.740126 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/extract-content/0.log" Oct 08 23:47:17 crc kubenswrapper[4834]: I1008 23:47:17.892212 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/extract-content/0.log" Oct 08 23:47:17 crc 
kubenswrapper[4834]: I1008 23:47:17.898922 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/extract-utilities/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.111721 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/extract-utilities/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.284159 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/extract-content/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.337778 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/extract-utilities/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.337856 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/extract-content/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.526667 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/extract-content/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.527687 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/extract-utilities/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.554273 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-npg7j_f137d3dc-1cb4-4d3b-a1a6-4c918ef2ddfe/registry-server/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.774346 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/util/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.968419 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/util/0.log" Oct 08 23:47:18 crc kubenswrapper[4834]: I1008 23:47:18.983216 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/pull/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.065240 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/pull/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.202942 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wvvps_f2fd386d-6b62-4282-9d8b-7325d579e8cb/registry-server/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.282878 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/extract/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.287529 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/pull/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.305430 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cxhztc_c6c7aec8-9631-4ba0-87af-691c252bd8f1/util/0.log" Oct 08 23:47:19 crc 
kubenswrapper[4834]: I1008 23:47:19.464471 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kvxj2_209dd6f3-8823-4e04-8e83-100706400bc8/marketplace-operator/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.518762 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/extract-utilities/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.683247 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/extract-content/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.686304 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/extract-content/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.693284 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/extract-utilities/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.886089 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/extract-content/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.897986 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/extract-utilities/0.log" Oct 08 23:47:19 crc kubenswrapper[4834]: I1008 23:47:19.906647 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/extract-utilities/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.042380 4834 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-q6hqr_057cb81c-17fd-4e22-8098-ea576e358559/registry-server/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.057317 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/extract-content/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.087731 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/extract-content/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.109341 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/extract-utilities/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.299312 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/extract-content/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.303273 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/extract-utilities/0.log" Oct 08 23:47:20 crc kubenswrapper[4834]: I1008 23:47:20.821503 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sg4bs_3482aeed-66bd-4fe3-81d6-c12cdab7f9d9/registry-server/0.log" Oct 08 23:47:26 crc kubenswrapper[4834]: I1008 23:47:26.556623 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:47:26 crc kubenswrapper[4834]: E1008 23:47:26.557627 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:47:39 crc kubenswrapper[4834]: I1008 23:47:39.565180 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:47:39 crc kubenswrapper[4834]: E1008 23:47:39.568427 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:47:50 crc kubenswrapper[4834]: I1008 23:47:50.555446 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:47:50 crc kubenswrapper[4834]: E1008 23:47:50.556122 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:48:04 crc kubenswrapper[4834]: I1008 23:48:04.556049 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:48:04 crc kubenswrapper[4834]: E1008 23:48:04.556994 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:48:16 crc kubenswrapper[4834]: I1008 23:48:16.555282 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:48:16 crc kubenswrapper[4834]: E1008 23:48:16.555921 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:48:23 crc kubenswrapper[4834]: I1008 23:48:23.509335 4834 generic.go:334] "Generic (PLEG): container finished" podID="c91f0734-c31f-4220-b58b-c3b8d9cf18e9" containerID="61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424" exitCode=0 Oct 08 23:48:23 crc kubenswrapper[4834]: I1008 23:48:23.509388 4834 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qbpqz/must-gather-mlvts" event={"ID":"c91f0734-c31f-4220-b58b-c3b8d9cf18e9","Type":"ContainerDied","Data":"61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424"} Oct 08 23:48:23 crc kubenswrapper[4834]: I1008 23:48:23.510737 4834 scope.go:117] "RemoveContainer" containerID="61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424" Oct 08 23:48:24 crc kubenswrapper[4834]: I1008 23:48:24.410864 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qbpqz_must-gather-mlvts_c91f0734-c31f-4220-b58b-c3b8d9cf18e9/gather/0.log" Oct 08 23:48:31 crc kubenswrapper[4834]: I1008 23:48:31.555981 4834 scope.go:117] 
"RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:48:31 crc kubenswrapper[4834]: E1008 23:48:31.557075 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:48:31 crc kubenswrapper[4834]: I1008 23:48:31.762088 4834 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qbpqz/must-gather-mlvts"] Oct 08 23:48:31 crc kubenswrapper[4834]: I1008 23:48:31.762473 4834 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qbpqz/must-gather-mlvts" podUID="c91f0734-c31f-4220-b58b-c3b8d9cf18e9" containerName="copy" containerID="cri-o://b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0" gracePeriod=2 Oct 08 23:48:31 crc kubenswrapper[4834]: I1008 23:48:31.771565 4834 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qbpqz/must-gather-mlvts"] Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.199636 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qbpqz_must-gather-mlvts_c91f0734-c31f-4220-b58b-c3b8d9cf18e9/copy/0.log" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.200352 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.355544 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-must-gather-output\") pod \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.356162 4834 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjhnv\" (UniqueName: \"kubernetes.io/projected/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-kube-api-access-bjhnv\") pod \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\" (UID: \"c91f0734-c31f-4220-b58b-c3b8d9cf18e9\") " Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.362055 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-kube-api-access-bjhnv" (OuterVolumeSpecName: "kube-api-access-bjhnv") pod "c91f0734-c31f-4220-b58b-c3b8d9cf18e9" (UID: "c91f0734-c31f-4220-b58b-c3b8d9cf18e9"). InnerVolumeSpecName "kube-api-access-bjhnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.458002 4834 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjhnv\" (UniqueName: \"kubernetes.io/projected/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-kube-api-access-bjhnv\") on node \"crc\" DevicePath \"\"" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.472521 4834 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c91f0734-c31f-4220-b58b-c3b8d9cf18e9" (UID: "c91f0734-c31f-4220-b58b-c3b8d9cf18e9"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.559118 4834 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c91f0734-c31f-4220-b58b-c3b8d9cf18e9-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.598656 4834 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qbpqz_must-gather-mlvts_c91f0734-c31f-4220-b58b-c3b8d9cf18e9/copy/0.log" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.599135 4834 generic.go:334] "Generic (PLEG): container finished" podID="c91f0734-c31f-4220-b58b-c3b8d9cf18e9" containerID="b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0" exitCode=143 Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.599200 4834 scope.go:117] "RemoveContainer" containerID="b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.599216 4834 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qbpqz/must-gather-mlvts" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.618082 4834 scope.go:117] "RemoveContainer" containerID="61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.662115 4834 scope.go:117] "RemoveContainer" containerID="b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0" Oct 08 23:48:32 crc kubenswrapper[4834]: E1008 23:48:32.663140 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0\": container with ID starting with b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0 not found: ID does not exist" containerID="b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.663189 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0"} err="failed to get container status \"b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0\": rpc error: code = NotFound desc = could not find container \"b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0\": container with ID starting with b714ef58b7c531bf2601afd0fce1b3340b0ed77e01b2a90701b95d0ad62d18d0 not found: ID does not exist" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.663209 4834 scope.go:117] "RemoveContainer" containerID="61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424" Oct 08 23:48:32 crc kubenswrapper[4834]: E1008 23:48:32.663506 4834 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424\": container with ID starting with 
61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424 not found: ID does not exist" containerID="61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424" Oct 08 23:48:32 crc kubenswrapper[4834]: I1008 23:48:32.663527 4834 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424"} err="failed to get container status \"61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424\": rpc error: code = NotFound desc = could not find container \"61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424\": container with ID starting with 61f07e9d0ac79819d100b0054d8b842996edd56953423e025ef74becc9ab9424 not found: ID does not exist" Oct 08 23:48:33 crc kubenswrapper[4834]: I1008 23:48:33.567063 4834 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91f0734-c31f-4220-b58b-c3b8d9cf18e9" path="/var/lib/kubelet/pods/c91f0734-c31f-4220-b58b-c3b8d9cf18e9/volumes" Oct 08 23:48:42 crc kubenswrapper[4834]: I1008 23:48:42.555840 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:48:42 crc kubenswrapper[4834]: E1008 23:48:42.556755 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:48:57 crc kubenswrapper[4834]: I1008 23:48:57.556441 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:48:57 crc kubenswrapper[4834]: E1008 23:48:57.557774 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:49:08 crc kubenswrapper[4834]: I1008 23:49:08.556455 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:49:08 crc kubenswrapper[4834]: E1008 23:49:08.557649 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:49:22 crc kubenswrapper[4834]: I1008 23:49:22.556040 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:49:22 crc kubenswrapper[4834]: E1008 23:49:22.557203 4834 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2" Oct 08 23:49:37 crc kubenswrapper[4834]: I1008 23:49:37.555929 4834 scope.go:117] "RemoveContainer" containerID="36936c038ea6cb9b590a7ad018de8061369ec9a1bf292cf5bd5d2d7149dfd6fd" Oct 08 23:49:37 crc kubenswrapper[4834]: E1008 23:49:37.558939 4834 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f9m4z_openshift-machine-config-operator(732cf917-b3ec-4649-99b0-66653902cfc2)\"" pod="openshift-machine-config-operator/machine-config-daemon-f9m4z" podUID="732cf917-b3ec-4649-99b0-66653902cfc2"